Patent abstract:
The present invention discloses a surgical image capture system that includes multiple illumination sources, each emitting light at a specified wavelength, a light sensor that receives the light reflected from a tissue sample illuminated by each of the illumination sources, and a computing system. The computing system can receive data from the light sensor when the tissue sample is illuminated by the light sources and calculate structural data related to one or more characteristics of a tissue structure. The structural data can be a surface feature, such as a surface roughness, or a structural composition, such as a composition of collagen and elastin. The computing system can additionally transmit the structural data to an intelligent surgical device. Intelligent devices can include an intelligent stapler, an intelligent RF sealing device, or an intelligent ultrasonic cutting device. The system may include a controller and computer-executable instructions for carrying out the above.
Publication number: BR112020012974A2
Application number: R112020012974-7
Filing date: 2018-07-30
Publication date: 2020-11-24
Inventors: Frederick E. Shelton Iv;Jason L. Harris;Tamara Widenhouse;David C. Yates
Applicant: Ethicon Llc;
IPC main class:
Patent description:

[0001] This application claims the benefit of priority under 35 U.S.C. 119(e) to US Provisional Patent Application Serial No. 62/649,291, entitled "USE OF LASER LIGHT AND RED-GREEN-
[0002] This application also claims priority under 35 U.S.C. 119(e) to US Provisional Patent Application Serial No. 62/611,341, entitled INTERACTIVE SURGICAL PLATFORM, filed on December 28, 2017, to US Provisional Patent Application Serial No. 62/611,340, entitled CLOUD-BASED MEDICAL ANALYTICS, filed on December 28, 2017, and to US Provisional Patent Application Serial No. 62/611,339, entitled ROBOT ASSISTED SURGICAL PLATFORM, filed on December 28, 2017, the description of each of which is incorporated herein by reference in its entirety.

BACKGROUND OF THE INVENTION
[0003] The present description relates to various surgical systems. Surgical procedures are typically performed in operating rooms at a health care facility, such as a hospital. A sterile field is typically created around the patient. The sterile field may include the members of the scrub team, who are properly dressed, and all furniture and accessories in the area. Various surgical devices and systems are used to perform a surgical procedure.

SUMMARY OF THE INVENTION
[0004] In some aspects, a surgical imaging system may include a plurality of light sources, where each light source is configured to emit light that has a specified central wavelength, a light sensor configured to receive a portion of the light reflected from a tissue sample when illuminated by one or more of the plurality of light sources, and a computing system. The computing system is configured to receive data from the light sensor when the tissue sample is illuminated by each of the plurality of light sources, calculate structural data related to a characteristic of a structure within the tissue sample based on the data received by the light sensor when the tissue sample is illuminated by each of the light sources, and transmit the structural data related to the characteristic of the structure to be received by an intelligent surgical device. The characteristic of the structure is a surface characteristic or a composition of the structure.
[0005] In one aspect of the surgical imaging system, the plurality of light sources may include at least one of a red light source, a green light source, and a blue light source.

[0006] In one aspect of the surgical imaging system, the plurality of light sources may include at least one of an infrared light source and an ultraviolet light source.

[0007] In one aspect of the surgical imaging system, the computing system configured to calculate structural data related to a characteristic of a structure within the tissue may include a computing system configured to calculate structural data related to a composition of a structure within the tissue.

[0008] In one aspect of the surgical imaging system, the computing system configured to calculate structural data related to a characteristic of a structure within the tissue comprises a computing system configured to calculate structural data related to a surface roughness of a structure within the tissue.
[0009] In some aspects, a surgical imaging system may include a processor and a memory coupled to the processor. The memory can store instructions executable by the processor to control the operation of a plurality of light sources illuminating a tissue sample, where each light source is configured to emit light having a specified central wavelength, receive data from a light sensor when the tissue sample is illuminated by each of the plurality of light sources, calculate structural data related to a characteristic of a structure within the tissue sample based on the data received by the light sensor when the tissue sample is illuminated by each of the light sources, and transmit the structural data related to the characteristic of the structure to be received by an intelligent surgical device. In some aspects, the characteristic of the structure can be a surface characteristic or a composition of the structure.
[0010] In one aspect of the surgical image capture system, the instructions executable by the processor to control the operation of a plurality of light sources comprise one or more instructions to illuminate the tissue sample sequentially with each of the plurality of light sources.
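By way of illustration only, the sequential illumination described above can be sketched as follows; the `LightSource` and `capture_sequential` names and the callback-based sensor interface are hypothetical, not part of the disclosure:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass(frozen=True)
class LightSource:
    """One illumination source with a specified central wavelength (nm)."""
    name: str
    central_wavelength_nm: float

def capture_sequential(
    sources: List[LightSource],
    read_sensor: Callable[[LightSource], List[float]],
) -> Dict[str, List[float]]:
    """Illuminate the tissue sample with each source in turn, capturing one
    sensor frame per source. Frames are keyed by source name so a later
    step can compare reflectance across wavelengths."""
    frames: Dict[str, List[float]] = {}
    for source in sources:
        # A real controller would energize the source, expose the light
        # sensor, and de-energize the source before the next exposure.
        frames[source.name] = read_sensor(source)
    return frames

# Example with stand-in wavelengths and a dummy sensor callback:
sources = [LightSource("red", 635.0), LightSource("green", 532.0),
           LightSource("blue", 405.0)]
frames = capture_sequential(sources, lambda s: [s.central_wavelength_nm] * 2)
```

The callback stands in for whatever hardware read the system actually performs; only the one-frame-per-source sequencing is the point.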
[0011] In one aspect of the surgical image capture system, the instructions executable by the processor to calculate structural data related to a characteristic of a structure within the tissue sample based on the data received by the light sensor comprise one or more instructions to calculate structural data related to a characteristic of a structure within the tissue sample based on a phase shift of the light reflected from the tissue sample.
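The phase-shift computation referenced above can be illustrated with a generic I/Q projection; this is a standard signal-processing sketch under the simplifying assumption of a single-frequency sinusoidal return, not the specific method of the disclosure:

```python
import math
from typing import Sequence

def estimate_phase_shift(samples: Sequence[float],
                         freq_hz: float,
                         sample_rate_hz: float) -> float:
    """Estimate the phase (radians) of a sinusoidal return signal relative
    to a cos(2*pi*f*t) reference by projecting the samples onto in-phase
    (I) and quadrature (Q) reference waveforms."""
    i = q = 0.0
    for n, s in enumerate(samples):
        t = n / sample_rate_hz
        i += s * math.cos(2.0 * math.pi * freq_hz * t)
        q += s * math.sin(2.0 * math.pi * freq_hz * t)
    return math.atan2(q, i)
```

For a return signal cos(2*pi*f*t - phi), the I and Q sums are proportional to cos(phi) and sin(phi), so `atan2` recovers phi directly when the capture window spans a whole number of cycles.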
[0012] In one aspect of the surgical imaging system, the composition of the structure may include a relative composition of collagen and elastin in a tissue.

[0013] In one aspect of the surgical imaging system, the composition of the structure may include an amount of hydration of a tissue.
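One generic way to estimate a relative two-component composition, such as collagen versus elastin, from absorbance measured at two wavelengths is linear spectral unmixing under a Beer-Lambert mixing assumption. The sketch below is illustrative only; the extinction coefficients in the example are invented numbers, not tissue data:

```python
from typing import Tuple

def unmix_two_components(
    absorbance: Tuple[float, float],
    eps: Tuple[Tuple[float, float], Tuple[float, float]],
) -> Tuple[float, float]:
    """Solve a 2x2 Beer-Lambert style mixing model: the absorbance measured
    at two wavelengths is modeled as eps @ concentrations, where eps[i][j]
    is the extinction coefficient of component j at wavelength i. Cramer's
    rule on the 2x2 system recovers the two concentrations."""
    (a11, a12), (a21, a22) = eps
    b1, b2 = absorbance
    det = a11 * a22 - a12 * a21
    if det == 0.0:
        raise ValueError("extinction coefficient matrix is singular")
    c1 = (b1 * a22 - b2 * a12) / det
    c2 = (a11 * b2 - a21 * b1) / det
    return c1, c2

def relative_composition(c1: float, c2: float) -> Tuple[float, float]:
    """Normalize two concentrations to fractions that sum to one."""
    total = c1 + c2
    return c1 / total, c2 / total
```

With more wavelengths than components, the same model would be solved by least squares instead of an exact 2x2 inversion.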
[0014] In some aspects, a surgical imaging system may include a control circuit configured to control the operation of a plurality of light sources illuminating a tissue sample, where each light source is configured to emit light having a specified central wavelength, receive data from a light sensor when the tissue sample is illuminated by each of the plurality of light sources, calculate structural data related to a characteristic of a structure within the tissue sample based on the data received by the light sensor when the tissue sample is illuminated by each of the light sources, and transmit the structural data related to the characteristic of the structure to be received by an intelligent surgical device. In some aspects, the characteristic of the structure is a surface characteristic or a composition of the structure.
[0015] In one aspect of the surgical image capture system, the control circuit is configured to transmit the structural data related to the characteristic of the structure to be received by an intelligent surgical device, in which the intelligent surgical device is an intelligent surgical stapler.
[0016] In one aspect of the surgical image capture system, the control circuit is additionally configured to transmit data related to an anvil pressure based on the characteristic of the structure to be received by the intelligent surgical stapler.
[0017] In one aspect of the surgical image capture system, the control circuit is configured to transmit the structural data related to the characteristic of the structure to be received by an intelligent surgical device, in which the intelligent surgical device is an intelligent RF sealing device.
[0018] In one aspect of the surgical image capture system, the control circuit is additionally configured to transmit data related to an amount of RF power based on the characteristic of the structure to be received by the intelligent RF sealing device.
[0019] In one aspect of the surgical imaging system, the control circuit is configured to transmit the structural data related to the characteristic of the structure to be received by an intelligent surgical device, in which the intelligent surgical device is an intelligent ultrasonic cutting device.

[0020] In one aspect of the surgical image capture system, the control circuit is additionally configured to transmit data related to an amount of power delivered to an ultrasonic transducer, or an activation frequency of the ultrasonic transducer, based on the characteristic of the structure to be received by the ultrasonic cutting device.
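The three device-specific aspects above share one pattern: structural data in, a device control parameter out. A hypothetical dispatch sketch follows; all constants and formulas are invented placeholders, not values from the disclosure:

```python
from typing import Dict

def device_parameters(device_type: str,
                      surface_roughness: float,
                      collagen_fraction: float) -> Dict[str, float]:
    """Map measured tissue characteristics to control parameters for each
    smart device type. Every constant and formula here is a made-up
    placeholder chosen only to show the dispatch pattern."""
    if device_type == "stapler":
        # Illustrative rule: more collagenous tissue -> higher anvil pressure.
        return {"anvil_pressure_kpa": 50.0 + 100.0 * collagen_fraction}
    if device_type == "rf_sealer":
        return {"rf_power_w": 30.0 + 20.0 * collagen_fraction}
    if device_type == "ultrasonic":
        return {"transducer_power_w": 15.0 + 10.0 * surface_roughness,
                "drive_frequency_khz": 55.5}
    raise ValueError(f"unknown smart device type: {device_type!r}")
```

In practice the mapping from tissue characteristic to device parameter would be a calibrated model per device, not a linear rule; only the routing shape is illustrated.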
[0021] In some aspects, a non-transitory computer-readable medium can store computer-readable instructions that, when executed, cause a machine to control the operation of a plurality of light sources illuminating a tissue sample, where each light source is configured to emit light that has a specified central wavelength, receive data from a light sensor when the tissue sample is illuminated by each of the plurality of light sources, calculate structural data related to a characteristic of a structure within the tissue sample based on the data received by the light sensor when the tissue sample is illuminated by each of the light sources, and transmit the structural data related to the characteristic of the structure to be received by an intelligent surgical device. In some aspects, the characteristic of the structure is a surface characteristic or a composition of the structure.

FIGURES
[0022] The features of various aspects are presented with particularity in the attached claims. The various aspects, however, both as to organization and methods of operation, together with additional objects and advantages thereof, can be better understood by reference to the description presented below, considered together with the attached drawings, as follows.
[0023] Figure 1 is a block diagram of a computer-implemented interactive surgical system, according to at least one aspect of the present description.

[0024] Figure 2 illustrates a surgical system being used to perform a surgical procedure in an operating room, in accordance with at least one aspect of the present description.

[0025] Figure 3 illustrates a central device or "central surgical controller" paired with a visualization system, a robotic system, and an intelligent instrument, in accordance with at least one aspect of the present description.

[0026] Figure 4 is a partial perspective view of a central surgical controller compartment, and of a combination generator module slidably received in the central surgical controller compartment, in accordance with at least one aspect of the present description.
[0027] Figure 5 is a perspective view of a generator module in combination with bipolar, ultrasonic, and monopolar contacts and a smoke evacuation component, in accordance with at least one aspect of the present description.

[0028] Figure 6 illustrates different power bus connectors for a plurality of side coupling ports of a side modular cabinet configured to receive a plurality of modules, in accordance with at least one aspect of the present description.

[0029] Figure 7 illustrates a vertical modular housing configured to receive a plurality of modules, in accordance with at least one aspect of the present description.

[0030] Figure 8 illustrates a surgical data network comprising a central modular communication controller configured to connect modular devices located in one or more operating rooms of a healthcare facility, or any environment in a utility facility specially equipped for surgical operations, to the cloud, in accordance with at least one aspect of the present description.

[0031] Figure 9 illustrates a computer-implemented interactive surgical system, in accordance with at least one aspect of the present description.

[0032] Figure 10 illustrates a central surgical controller comprising a plurality of modules coupled to a modular control tower, according to at least one aspect of the present description.

[0033] Figure 11 illustrates an aspect of a universal serial bus (USB) network central controller device, according to at least one aspect of the present description.

[0034] Figure 12 illustrates a logic diagram of a control system for an instrument or surgical tool, according to at least one aspect of the present description.

[0035] Figure 13 illustrates a control circuit configured to control aspects of the instrument or surgical tool, according to at least one aspect of the present description.

[0036] Figure 14 illustrates a combinational logic circuit configured to control aspects of the instrument or surgical tool, according to at least one aspect of the present description.

[0037] Figure 15 illustrates a sequential logic circuit configured to control aspects of the instrument or surgical tool, according to at least one aspect of the present description.

[0038] Figure 16 illustrates an instrument or surgical tool comprising a plurality of motors that can be activated to perform various functions, according to at least one aspect of the present description.

[0039] Figure 17 is a schematic diagram of a robotic surgical instrument configured to operate a surgical tool described herein, in accordance with at least one aspect of the present description.

[0040] Figure 18 illustrates a block diagram of a surgical instrument programmed to control the distal translation of a displacement member, according to an aspect of the present description.

[0041] Figure 19 is a schematic diagram of a surgical instrument configured to control various functions, in accordance with at least one aspect of the present description.
[0042] Figure 20 is a simplified block diagram of a generator configured to provide inductorless tuning, among other benefits, according to at least one aspect of the present description.
[0043] Figure 21 illustrates an example of a generator, which is one form of the generator of Figure 20, according to at least one aspect of the present description.

[0044] Figure 22A illustrates a visualization system that can be incorporated into a surgical system, in accordance with at least one aspect of the present description.

[0045] Figure 22B illustrates a top plan view of a manual unit of the visualization system of Figure 22A, in accordance with at least one aspect of the present description.

[0046] Figure 22C illustrates a side plan view of the manual unit depicted in Figure 22A together with an imaging sensor arranged thereon, in accordance with at least one aspect of the present description.

[0047] Figure 22D illustrates a plurality of the imaging sensors depicted in Figure 22C, according to at least one aspect of the present description.

[0048] Figure 23A illustrates a plurality of laser emitters that can be incorporated into the visualization system of Figure 22A, in accordance with at least one aspect of the present description.

[0049] Figure 23B illustrates the illumination of an image sensor that has a Bayer color filter pattern, in accordance with at least one aspect of the present description.

[0050] Figure 23C illustrates a graphical representation of the operation of a pixel array for a plurality of frames, according to at least one aspect of the present description.

[0051] Figure 23D illustrates a schematic representation of an example of an operation sequence of chrominance and luminance frames, in accordance with at least one aspect of the present description.

[0052] Figure 23E illustrates an example of sensor and emitter patterns, in accordance with at least one aspect of the present description.

[0053] Figure 23F illustrates a graphical representation of the operation of a pixel array, according to at least one aspect of the present description.

[0054] Figure 24 illustrates a schematic representation of an example of instrumentation for NIR spectroscopy, according to an aspect of the present description.

[0055] Figure 25 schematically illustrates an example of instrumentation for determining NIRS based on Fourier transform infrared imaging, in accordance with at least one aspect of the present description.
[0056] Figures 26A to 26C illustrate a shift in the wavelength of light scattered from moving blood cells, in accordance with at least one aspect of the present description.

[0057] Figure 27 illustrates an aspect of the instrumentation that can be used to detect a Doppler effect in laser light scattered from portions of a tissue, in accordance with at least one aspect of the present description.

[0058] Figure 28 schematically illustrates some optical effects on light that falls on a tissue that has subsurface structures, according to at least one aspect of the present description.

[0059] Figure 29 illustrates an example of the effects on a Doppler analysis of light that falls on a tissue sample that has subsurface structures, in accordance with at least one aspect of the present description.

[0060] Figures 30A to 30C schematically illustrate the detection of blood cells moving at a tissue depth based on a laser Doppler analysis at a variety of laser wavelengths, in accordance with at least one aspect of the present description.

[0061] Figure 30D illustrates the effect of illuminating a CMOS imaging sensor with a plurality of wavelengths of light over time, in accordance with at least one aspect of the present description.

[0062] Figure 31 illustrates an example of the use of Doppler imaging to detect the presence of subsurface blood vessels, in accordance with at least one aspect of the present description.

[0063] Figure 32 illustrates a method for identifying a subsurface blood vessel based on a Doppler effect of blue light due to the blood cells flowing through it, in accordance with at least one aspect of the present description.

[0064] Figure 33 schematically illustrates the location of a deep subsurface blood vessel, in accordance with at least one aspect of the present description.

[0065] Figure 34 schematically illustrates the location of a superficial subsurface blood vessel, in accordance with at least one aspect of the present description.

[0066] Figure 35 illustrates a composite image comprising a surface image and an image of a subsurface blood vessel, in accordance with at least one aspect of the present description.

[0067] Figure 36 is a flow chart of a method for determining the depth of a surface feature in a piece of tissue, in accordance with at least one aspect of the present description.

[0068] Figure 37 illustrates the effect of the location and characteristics of non-vascular structures on the light falling on a tissue sample, according to at least one aspect of the present description.

[0069] Figure 38 schematically represents an example of components used in a full-field OCT device, in accordance with at least one aspect of the present description.

[0070] Figure 39 schematically illustrates the effect of tissue anomalies on the light reflected from a tissue sample, according to at least one aspect of the present description.

[0071] Figure 40 illustrates a display of an image derived from a combination of tissue visualization modalities, in accordance with at least one aspect of the present description.

[0072] Figures 41A to 41C illustrate various aspects of displays that can be provided to a surgeon for visual identification of a combination of tissue surface and subsurface structures at a surgical site, according to at least one aspect of the present description.

[0073] Figure 42 is a flow chart of a method for providing information related to a tissue characteristic to an intelligent surgical instrument, in accordance with at least one aspect of the present description.

[0074] Figures 43A and 43B illustrate a light sensor with multiple pixels receiving light reflected by a tissue illuminated by sequential exposure to red, green, blue, and infrared laser light sources, and to red, green, blue, and ultraviolet laser light sources, respectively, in accordance with at least one aspect of the present description.
[0075] Figures 44A and 44B illustrate the distal end of an elongated camera probe with a single light sensor and with two light sensors, respectively, according to at least one aspect of the present description.

[0076] Figure 44C illustrates a perspective view of an example of a monolithic sensor that has a plurality of pixel arrays, in accordance with at least one aspect of the present description.

[0077] Figure 45 illustrates an example of a pair of fields of view available to two image sensors of an elongated camera probe, in accordance with at least one aspect of the present description.

[0078] Figures 46A to 46D illustrate additional examples of a pair of fields of view available to two image sensors of an elongated camera probe, in accordance with at least one aspect of the present description.

[0079] Figures 47A to 47C illustrate an example of the use of an imaging system that incorporates the features disclosed in Figure 46D, in accordance with at least one aspect of the present description.

[0080] Figures 48A and 48B illustrate another example of the use of a double imaging system, in accordance with at least one aspect of the present description.

[0081] Figures 49A to 49C illustrate examples of a sequence of surgical steps that can benefit from the use of multiple image analyses at the surgical site, in accordance with at least one aspect of the present description.

[0082] Figure 50 is a timeline representing the situational awareness of a central surgical controller, according to at least one aspect of the present description.

DESCRIPTION
[0083] The applicant of this application owns the following US provisional patent applications, filed on March 28, 2018, each of which is incorporated herein by reference in its entirety: ● US Provisional Patent Application Serial No. 62/649,302,
[0084] The applicant of this application owns the following US patent applications, filed on March 29, 2018, each of which is incorporated herein by reference in its entirety: ● US Patent Application Serial No. ____________, entitled INTERACTIVE SURGICAL SYSTEMS WITH ENCRYPTED COMMUNICATION CAPABILITIES; attorney docket number END8499USNP/170766; ● US Patent Application Serial No. ____________, entitled INTERACTIVE SURGICAL SYSTEMS WITH CONDITION HANDLING OF DEVICES AND DATA CAPABILITIES; attorney docket number END8499USNP1/170766-1; ● US Patent Application Serial No. ____________, entitled SURGICAL HUB COORDINATION OF CONTROL AND COMMUNICATION OF OPERATING ROOM DEVICES; attorney docket number END8499USNP2/170766-2; ● US Patent Application Serial No. ____________, entitled SPATIAL AWARENESS OF SURGICAL HUBS IN OPERATING ROOMS; attorney docket number END8499USNP3/170766-3;
[0085] The applicant of this application owns the following US patent applications, filed on March 29, 2018, each of which is incorporated herein by reference in its entirety: ● US Patent Application Serial No. ____________,
[0086] The applicant of this application owns the following US patent applications, filed on March 29, 2018, each of which is incorporated herein by reference in its entirety: ● US Patent Application Serial No. ____________, entitled DRIVE ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; attorney docket number END8511USNP/170778; ● US Patent Application Serial No. ____________, entitled COMMUNICATION ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; attorney docket number END8511USNP1/170778-1; ● US Patent Application Serial No. ____________, entitled CONTROLS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; attorney docket number END8511USNP2/170778-2; ● US Patent Application Serial No. ____________, entitled AUTOMATIC TOOL ADJUSTMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; attorney docket number END8512USNP/170779; ● US Patent Application Serial No. ____________, entitled CONTROLLERS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; attorney docket number END8512USNP1/170779-1; ● US Patent Application Serial No. ____________, entitled COOPERATIVE SURGICAL ACTIONS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; attorney docket number END8512USNP2/170779-2;
[0087] Before explaining the various aspects of surgical instruments and generators in detail, it should be noted that the illustrative examples are not limited, in application or use, to the details of construction and arrangement of parts illustrated in the accompanying drawings and description. The illustrative examples can be implemented or incorporated in other aspects, variations, and modifications, and can be practiced or performed in a variety of ways. Furthermore, except where otherwise indicated, the terms and expressions used herein were chosen for the purpose of describing the illustrative examples for the convenience of the reader and not for the purpose of limitation. In addition, it should be understood that one or more of the aspects, expressions of aspects, and/or examples described below can be combined with any one or more of the other aspects, expressions of aspects, and/or examples described below.
[0088] Referring to Figure 1, a computer-implemented interactive surgical system 100 includes one or more surgical systems 102 and a cloud-based system (e.g., a cloud 104 that may include a remote server 113 coupled to a storage device 105). Each surgical system 102 includes at least one central surgical controller 106 in communication with the cloud 104, which may include a remote server 113. In one example, as illustrated in Figure 1, the surgical system 102 includes a visualization system 108, a robotic system 110, and a smart handheld surgical instrument 112, which are configured to communicate with one another and/or with the central controller 106. In some aspects, a surgical system 102 may include an M number of central controllers 106, an N number of visualization systems 108, an O number of robotic systems 110, and a P number of smart handheld surgical instruments 112, where M, N, O, and P are whole numbers greater than or equal to one.
[0089] Figure 2 depicts an example of a surgical system 102 being used to perform a surgical procedure on a patient who is lying on an operating table 114 in a surgical operating room 116. A robotic system 110 is used in the surgical procedure as a part of the surgical system 102. The robotic system 110 includes a surgeon console 118, a patient cart 120 (surgical robot), and a robotic central surgical controller
[0090] Other types of robotic systems can readily be adapted for use with the surgical system 102. Various examples of robotic systems and surgical instruments that are suitable for use with the present description are described in US Provisional Patent Application Serial No. 62/611,339, entitled ROBOT ASSISTED SURGICAL PLATFORM, filed on December 28, 2017, the description of which is incorporated herein by reference in its entirety.

[0091] Various examples of cloud-based analyses that are performed by the cloud 104, and are suitable for use with the present description, are described in US Provisional Patent Application Serial No. 62/611,340, entitled CLOUD-BASED MEDICAL ANALYTICS, filed on December 28, 2017, the description of which is incorporated herein by reference in its entirety.
[0092] In various aspects, the imaging device 124 includes at least one image sensor and one or more optical components. Suitable image sensors include, but are not limited to, charge-coupled device (CCD) sensors and complementary metal-oxide-semiconductor (CMOS) sensors.
[0093] The optical components of the imaging device 124 may include one or more light sources and/or one or more lenses. The one or more light sources can be directed to illuminate portions of the surgical field. The one or more image sensors can receive light reflected or refracted from the surgical field, including light reflected or refracted from tissue and/or surgical instruments.

[0094] The one or more light sources can be configured to radiate electromagnetic energy in the visible spectrum as well as in the invisible spectrum. The visible spectrum, sometimes called the optical spectrum or luminous spectrum, is that portion of the electromagnetic spectrum that is visible to (that is, can be detected by) the human eye and may be called visible light or simply light. A typical human eye will respond to wavelengths in air that are from about 380 nm to about 750 nm.
[0095] The invisible spectrum (that is, the non-luminous spectrum) is that portion of the electromagnetic spectrum located below and above the visible spectrum (that is, wavelengths below about 380 nm and above about 750 nm). The invisible spectrum is not detectable by the human eye. Wavelengths greater than about 750 nm are longer than the red visible spectrum, and they become invisible infrared (IR), microwave, and radio electromagnetic radiation. Wavelengths shorter than about 380 nm are shorter than the violet visible spectrum, and they become invisible ultraviolet, x-ray, and gamma-ray electromagnetic radiation.
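The band boundaries described above can be captured in a small helper; the function name and return strings are illustrative only:

```python
def classify_wavelength(nm: float) -> str:
    """Classify a wavelength in nanometers against the approximate visible
    band described above (about 380 nm to about 750 nm)."""
    if nm < 380.0:
        return "invisible: ultraviolet side (UV, x-ray, gamma ray)"
    if nm > 750.0:
        return "invisible: infrared side (IR, microwave, radio)"
    return "visible"
```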
[0096] In various aspects, the imaging device 124 is configured for use in a minimally invasive procedure. Examples of imaging devices suitable for use with the present description include, but are not limited to, an arthroscope, angioscope, bronchoscope, choledochoscope, colonoscope, cystoscope, duodenoscope, enteroscope, esophagogastroduodenoscope (gastroscope), endoscope, laryngoscope, nasopharyngoscope, nephroscope, sigmoidoscope, thoracoscope, and ureteroscope.
[0097] In one aspect, the imaging device employs multiple-spectrum monitoring to discriminate topography and underlying structures. A multispectral image is one that captures image data within wavelength bands across the electromagnetic spectrum. The wavelengths can be separated by filters or by using instruments that are sensitive to specific wavelengths, including light from frequencies beyond the visible light range, for example, IR and ultraviolet light. Spectral imaging can allow the extraction of additional information that the human eye cannot capture with its receptors for the colors red, green, and blue. The use of multispectral imaging is described in more detail under the heading "Advanced Imaging Acquisition Module" in the Patent Application
[0098] It is axiomatic that strict sterilization of the operating room and surgical equipment is required during any surgery. The strict hygiene and sterilization conditions required in an "operating room", that is, an operating or treatment room, justify the highest possible sterility of all medical devices and equipment. Part of this sterilization process is the need to sterilize anything that comes into contact with the patient or penetrates the sterile field, including the imaging device 124 and its connectors and components. It will be understood that the sterile field can be considered a specified area, such as inside a tray or on a sterile towel, that is considered free of microorganisms, or the sterile field can be considered an area, immediately around a patient, that has been prepared for a surgical procedure. The sterile field may include the members of the scrub team, who are properly dressed, and all furniture and accessories in the area.
[0099] [0099] In various aspects, the visualization system 108 includes one or more imaging sensors, one or more image processing units, one or more storage arrays, and one or more screens that are strategically arranged with respect to the sterile field, as shown in Figure 2. In one aspect, the visualization system 108 includes an interface for HL7, PACS, and EMR. Various components of the visualization system 108 are described under the heading "Advanced Imaging Acquisition Module" in US Provisional Patent Application Serial No. 62/611,341, entitled INTERACTIVE SURGICAL PLATFORM, filed on December 28, 2017, the description of which is incorporated herein by reference in its entirety.
[0100] [0100] As shown in Figure 2, a primary screen 119 is positioned in the sterile field so as to be visible to an operator at the operating table 114. In addition, a visualization tower 111 is positioned outside the sterile field. The visualization tower 111 includes a first non-sterile screen 107 and a second non-sterile screen 109, which face away from each other. The visualization system 108, guided by the central controller 106, is configured to use the screens 107, 109, and 119 to coordinate the flow of information to operators inside and outside the sterile field. For example, the central controller 106 can cause the visualization system 108 to display a snapshot of a surgical site, as recorded by an imaging device 124, on a non-sterile screen 107 or 109, while maintaining a live feed of the surgical site on the primary screen 119. The snapshot on the non-sterile screen 107 or 109 can allow a non-sterile operator to perform a diagnostic step relevant to the surgical procedure, for example.
[0101] [0101] In one aspect, the central controller 106 is also configured to route a diagnostic input or feedback entered by a non-sterile operator at the visualization tower 111 to the primary screen 119 within the sterile field, where it can be seen by a sterile operator at the operating table. In one example, the input may be in the form of a modification of the snapshot displayed on the non-sterile screen 107 or 109, which can be routed to the primary screen 119 by the central controller 106.
[0102] [0102] Referring to Figure 2, a surgical instrument 112 is being used in the surgical procedure as part of the surgical system
[0103] [0103] Now with reference to Figure 3, a central controller 106 is shown in communication with a display system 108, a robotic system 110 and a smart handheld surgical instrument 112. Central controller 106 includes a central controller screen 135, an imaging module 138, a generator module 140, a communication module 130, a processor module 132 and a storage matrix 134. In certain respects, as shown in Figure 3, central controller 106 additionally includes a smoke evacuation module 126 and / or a suction / irrigation module 128.
[0104] [0104] During a surgical procedure, the application of energy to tissue, for sealing and/or cutting, is generally associated with smoke evacuation, suction of excess fluid, and/or irrigation of the tissue. Fluid, power, and/or data lines from different sources are often entangled during the surgical procedure. Valuable time can be lost addressing this issue during a surgical procedure. To untangle the lines, it may be necessary to disconnect them from their respective modules, which may require a restart of the modules. The modular compartment of the central controller 136 offers a unified environment for managing power, data, and fluid lines, which reduces the frequency of entanglement among such lines.
[0105] [0105] Aspects of the present description feature a central surgical controller for use in a surgical procedure that involves applying energy to tissue at a surgical site. The central surgical controller includes a central controller compartment and a combined generator module slidably received in a docking station of the central controller compartment. The docking station includes data and power contacts. The combined generator module includes two or more of an ultrasonic energy generating component, a bipolar RF energy generating component, and a monopolar RF energy generating component, which are housed in a single unit. In one aspect, the combined generator module also includes a smoke evacuation component, at least one energy application cable to connect the combined generator module to a surgical instrument, at least one smoke evacuation component configured to evacuate smoke, fluid, and/or particulates generated by the application of therapeutic energy to the tissue, and a fluid line that extends from the remote surgical site to the smoke evacuation component.
[0106] [0106] In one aspect, the fluid line is a first fluid line and a second fluid line extends from the remote surgical site to a suction and irrigation module received slidingly in the central controller compartment. In one aspect, the central controller compartment comprises a fluid interface.
[0107] [0107] Certain surgical procedures may require the application of more than one type of energy to the tissue. One type of energy may be better suited to cutting the tissue, while another type may be better suited to sealing it. For example, a bipolar generator can be used to seal the tissue while an ultrasonic generator can be used to cut the sealed tissue. Aspects of the present description present a solution in which a modular compartment of the central controller 136 is configured to accommodate different generators and facilitate interactive communication between them. One of the advantages of the central modular compartment 136 is that it allows quick removal and/or replacement of several modules.
[0108] [0108] Aspects of the present description feature a modular surgical compartment for use in a surgical procedure that involves applying energy to the tissue. The modular surgical compartment includes a first energy generator module, configured to generate a first energy for application to the tissue, and a first docking station comprising a first docking port that includes first data and power contacts, in which the first energy generator module is slidably movable into an electrical coupling with the first data and power contacts, and the first energy generator module is slidably movable out of the electrical coupling with the first data and power contacts.
[0109] [0109] In addition to the above, the modular surgical compartment also includes a second energy generator module configured to generate a second energy, different from the first energy, for application to the tissue, and a second docking station comprising a second docking port that includes second data and power contacts, in which the second energy generator module is slidably movable into an electrical coupling with the second data and power contacts, and in which the second energy generator module is slidably movable out of the electrical coupling with the second data and power contacts.
[0110] [0110] In addition, the modular surgical compartment also includes a communication bus between the first coupling port and the second coupling port, configured to facilitate communication between the first energy generating module and the second energy generating module.
[0111] [0111] With reference to Figures 3 to 7, aspects of the present description are presented for a modular compartment of the central controller 136 that allows the modular integration of a generator module 140, a smoke evacuation module 126, and a suction/irrigation module 128. The modular compartment of the central controller 136 further facilitates interactive communication between the modules 140, 126, 128. As illustrated in Figure 5, the generator module 140 can be a generator module with integrated monopolar, bipolar, and ultrasonic components, supported in a single housing unit 139 slidably insertable into the central modular compartment 136. As shown in Figure 5, the generator module 140 can be configured to connect to a monopolar device 146, a bipolar device 147, and an ultrasonic device 148. Alternatively, the generator module 140 may comprise a series of monopolar, bipolar, and/or ultrasonic generator modules that interact through the central modular compartment 136. The central modular compartment 136 can be configured to facilitate the insertion of multiple generators and interactive communication between the generators docked in the central modular compartment 136 so that the generators act as a single generator.
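The idea of separately docked generator modules presenting themselves as a single generator can be sketched in code. This is a hedged illustration with a hypothetical API; the class and method names are assumptions, not part of the description.

```python
# Minimal sketch (hypothetical API) of a modular housing in which
# separately docked generator modules can be addressed as one generator.
class ModularHousing:
    def __init__(self):
        self._docked = {}          # modality name -> generator module

    def dock(self, modality, module):
        """Slide a generator module into a docking station."""
        self._docked[modality] = module

    def undock(self, modality):
        """Remove a module for quick replacement."""
        return self._docked.pop(modality, None)

    def apply_energy(self, modality, level):
        """Route a request to whichever docked module provides the
        modality, so the housing behaves as a single combined generator."""
        if modality not in self._docked:
            raise LookupError(f"no {modality} module docked")
        return self._docked[modality](level)

housing = ModularHousing()
housing.dock("bipolar", lambda level: f"bipolar RF at {level} W")
housing.dock("ultrasonic", lambda level: f"ultrasonic at {level} W")
```

A caller then need not know which physical module is docked; the housing resolves the request, mirroring the interactive communication through the compartment's backplane.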
[0112] [0112] In one aspect, the central modular compartment 136 comprises a modular power and communication backplane 149 with external and wireless communication headers to allow removable attachment of the modules 140, 126, 128 and interactive communication between them.
[0113] [0113] In one aspect, the central modular compartment 136 includes docking stations, or drawers, 151, which are configured to slidably receive the modules 140, 126, 128. Figure 4 illustrates a partial perspective view of a compartment of the central surgical controller 136, and a combined generator module 145 slidably received in a docking station 151 of the compartment of the central surgical controller 136. A docking port 152 with power and data contacts on a rear side of the combined generator module 145 is configured to engage a corresponding docking port 150 with power and data contacts of a corresponding docking station 151 of the modular compartment of the central controller 136 as the combined generator module 145 is slid into position in the corresponding docking station 151 of the modular compartment of the central controller 136. In one aspect, the combined generator module 145 includes a bipolar, ultrasonic, and monopolar module and a smoke evacuation module integrated in a single compartment unit 139, as shown in Figure 5.
[0114] [0114] In several respects, the smoke evacuation module 126 includes a fluid line 154 that carries captured/collected smoke and fluid away from a surgical site and to, for example, the smoke evacuation module 126. Suction from a vacuum originating in the smoke evacuation module 126 can draw the smoke into an opening of a utility conduit at the surgical site. The utility conduit, coupled to the fluid line, can be in the form of a flexible tube that terminates at the smoke evacuation module 126. The utility conduit and the fluid line define a fluid path that extends toward the smoke evacuation module 126, which is received in the central controller compartment
[0115] [0115] In several respects, the suction/irrigation module 128 is coupled to a surgical tool comprising an aspiration fluid line and a suction fluid line. In one example, the aspiration and suction fluid lines are in the form of flexible tubes that extend from the surgical site toward the suction/irrigation module 128. One or more drive systems can be configured to cause irrigation and aspiration of fluids to and from the surgical site.
[0116] [0116] In one aspect, the surgical tool includes a drive shaft that has an end effector at a distal end thereof and at least one energy treatment device associated with the end effector, an aspiration tube, and an irrigation tube. The aspiration tube can have an inlet port at a distal end thereof, and the aspiration tube extends through the drive shaft. Similarly, an irrigation tube can extend through the drive shaft and can have an inlet port close to the energy application implement. The energy application implement is configured to deliver ultrasonic and/or RF energy to the surgical site and is coupled to the generator module 140 by a cable that initially extends through the drive shaft.
[0117] [0117] The irrigation tube can be in fluid communication with a fluid source, and the aspiration tube can be in fluid communication with a vacuum source. The fluid source and/or the vacuum source can be housed in the suction/irrigation module 128. In one example, the fluid source and/or the vacuum source can be housed in the central controller compartment 136 separately from the suction/irrigation module 128. In such an example, a fluid interface can be configured to connect the suction/irrigation module 128 to the fluid source and/or the vacuum source.
[0118] [0118] In one aspect, the modules 140, 126, 128 and/or their corresponding docking stations in the central modular compartment 136 may include alignment features that are configured to align the docking ports of the modules into engagement with their counterparts at the docking stations of the central modular compartment 136. For example, as shown in Figure 4, the combined generator module 145 includes side brackets 155 that are configured to slidably engage the corresponding brackets 156 of the corresponding docking station 151 of the central modular compartment 136. The brackets cooperate to guide the docking port contacts of the combined generator module 145 into an electrical coupling with the docking port contacts of the central modular compartment 136.
[0119] [0119] In some respects, the drawers 151 of the central modular compartment 136 are the same, or substantially the same size, and the modules are adjusted in size to be received in the drawers 151. For example, the side brackets 155 and / or 156 can be larger or smaller depending on the size of the module. In other respects, drawers 151 are different in size and are each designed to accommodate a specific module.
[0120] [0120] In addition, the contacts of a specific module can be keyed to engage with the contacts of a specific drawer, to avoid inserting a module into a drawer with unpaired contacts.
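The keyed-contact check above can be expressed as a simple compatibility test. The drawer names and key patterns here are hypothetical assumptions used only to illustrate the idea.

```python
# Hedged sketch of contact "keying": each drawer advertises a key
# pattern, and a module seats only when its own key matches. The
# drawer names and keys below are hypothetical examples.
DRAWER_KEYS = {"drawer_1": "generator", "drawer_2": "smoke_evac"}

def can_seat(module_key, drawer):
    """Return True only when the module's key matches the drawer's,
    preventing insertion into a drawer with unpaired contacts."""
    return DRAWER_KEYS.get(drawer) == module_key
```

Mechanically, the same result is achieved by physical key features; the check here is the electrical/logical analogue.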
[0121] [0121] As shown in Figure 4, the docking port 150 of one drawer 151 can be coupled to the docking port 150 of another drawer 151 via a communication link 157 to facilitate interactive communication between the modules housed in the modular compartment central 136. The coupling ports 150 of the central modular compartment 136 can, alternatively or additionally, facilitate interactive wireless communication between the modules housed in the central modular compartment 136. Any suitable wireless communication can be used, for example, Air Titan Bluetooth.
[0122] [0122] Figure 6 illustrates individual power bus connectors for a plurality of side coupling ports of a side modular compartment 160 configured to receive a plurality of modules from a central surgical controller 206. Side modular compartment 160 is configured to receive and laterally interconnect modules 161. Modules 161 are slidably inserted into docking stations 162 of side modular compartment 160, which includes a back plate for interconnecting modules 161. As shown in Figure 6, modules 161 are arranged laterally in the side modular cabinet
[0123] [0123] Figure 7 illustrates a vertical modular cabinet 164 configured to receive a plurality of modules 165 of the central surgical controller 106. The modules 165 are slidably inserted into docking stations, or drawers, 167 of the vertical modular cabinet 164, which includes a back panel for interconnecting the modules 165. Although the drawers 167 of the vertical modular cabinet 164 are arranged vertically, in certain cases a vertical modular cabinet 164 may include drawers that are arranged laterally. In addition, the modules 165 can interact with each other through the docking ports of the vertical modular cabinet
[0124] [0124] In several respects, the imaging module 138 comprises an integrated video processor and a modular light source and is adapted for use with various imaging devices. In one aspect, the imaging device is comprised of a modular compartment that can be fitted with a light source module and a camera module. The compartment can be a disposable compartment. In at least one example, the disposable compartment is removably coupled to a reusable controller, a light source module, and a camera module. The light source module and/or the camera module can be selected according to the type of surgical procedure. In one aspect, the camera module comprises a CCD sensor. In another aspect, the camera module comprises a CMOS sensor. In another aspect, the camera module is configured for scanned-beam imaging. Similarly, the light source module can be configured to provide white light or a different light, depending on the surgical procedure.
[0125] [0125] During a surgical procedure, removing a surgical device from the surgical field and replacing it with another surgical device that includes a different camera or a different light source can be inefficient. Temporarily losing sight of the surgical field can lead to undesirable consequences. The imaging device module of the present description is configured to allow the replacement of a light source module or a camera module "midstream" during a surgical procedure, without the need to remove the imaging device from the surgical field.
[0126] [0126] In one aspect, the imaging device comprises a tubular compartment that includes a plurality of channels. A first channel is configured to slidably receive the camera module, which can be configured for a snap-fit engagement (pressure fit) with the first channel. A second channel is configured to slidably receive the light source module, which can be configured for a snap-fit engagement (pressure fit) with the second channel. In another example, the camera module and/or the light source module can be rotated to an end position within their respective channels. A threaded coupling can be used instead of a pressure fit.
[0127] [0127] In several examples, multiple imaging devices are placed in different positions in the surgical field to provide multiple views. Imaging module 138 can be configured to switch between imaging devices to provide an ideal view. In several respects, imaging module 138 can be configured to integrate images from different imaging devices.
[0128] [0128] Various image processors and imaging devices suitable for use with the present description are described in US Patent No. 7,995,045, entitled COMBINED SBI AND CONVENTIONAL IMAGE PROCESSOR, granted on August 9, 2011, which is incorporated herein by reference in its entirety. In addition, US Patent No. 7,982,776, entitled SBI MOTION ARTIFACT REMOVAL APPARATUS AND METHOD, granted on July 19, 2011, which is incorporated herein by reference in its entirety, describes various systems for removing motion artifacts from image data. Such systems can be integrated with the imaging module 138. In addition to these, US Patent Application Publication No. 2011/0306840, entitled CONTROLLABLE
[0129] [0129] Figure 8 illustrates a surgical data network 201 comprising a central modular communication controller 203 configured to connect modular devices located in one or more operating rooms of a healthcare facility, or any room in a healthcare facility specially equipped for surgical operations, to a cloud-based system (for example, a cloud 204 that may include a remote server 213 coupled to a storage device 205). In one aspect, the central modular communication controller 203 comprises a central network controller 207 and/or a network switch 209 in communication with a network router. The central modular communication controller 203 can also be coupled to a local computer system 210 to provide local computer processing and data manipulation. The surgical data network 201 can be configured as a passive, intelligent, or switching network. A passive surgical data network serves as a conduit for the data, allowing the data to be transmitted from one device (or segment) to another and to cloud computing resources. An intelligent surgical data network includes features that allow the traffic passing through the surgical data network to be monitored and each port on the central network controller 207 or network switch 209 to be configured. An intelligent surgical data network can be called a manageable central controller or switch. A central switching controller reads the destination address of each packet and then forwards the packet to the correct port.
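The switching behavior described above can be sketched briefly. This is a generic illustration of address-based forwarding, not the actual firmware of the controller; the addresses and port numbers are hypothetical.

```python
# Sketch of a switching hub: it reads each packet's destination
# address and forwards the packet only to the port learned for that
# address; a passive network would instead repeat it on every port.
class SwitchingHub:
    def __init__(self):
        self._table = {}                 # destination address -> port

    def learn(self, address, port):
        """Record which port a device address was seen on."""
        self._table[address] = port

    def forward(self, packet):
        """Return the output port for the packet's destination, or
        flood when the address is unknown (hypothetical behavior)."""
        return self._table.get(packet["dst"], "flood_all_ports")

hub = SwitchingHub()
hub.learn("AA:BB:CC:00:00:01", 3)        # hypothetical device address
```

This per-port forwarding is what distinguishes the intelligent/switching configurations from the passive conduit described in the same paragraph.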
[0130] [0130] The modular devices 1a to 1n located in the operating room can be coupled to the central modular communication controller 203. The central network controller 207 and/or the network switch 209 can be coupled to a network router 211 to connect the devices 1a to 1n to the cloud 204 or the local computer system
[0131] [0131] It will be understood that the surgical data network 201 can be expanded by interconnecting multiple central network controllers 207 and/or multiple network switches 209 with multiple network routers 211. The modular communication center 203 may be contained in a modular control tower configured to receive multiple devices 1a to 1n / 2a to 2m. The local computer system 210 can also be contained in a modular control tower. The modular communication center 203 is connected to a screen 212 to display images obtained by some of the devices 1a to 1n / 2a to 2m, for example, during surgical procedures. In several respects, the devices 1a to 1n / 2a to 2m can include, for example, several modules such as an imaging module 138 coupled to an endoscope, a generator module 140 coupled to an energy-based surgical device, a smoke evacuation module 126, a suction/irrigation module 128, a communication module 130, a processor module 132, a storage matrix 134, a surgical device attached to a screen, and/or a non-contact sensor module, among other modular devices that can be connected to the modular communication center 203 of the surgical data network 201.
[0132] [0132] In one aspect, the surgical data network 201 may comprise a combination of central network controllers, network switches, and network routers that connect the devices 1a to 1n / 2a to 2m to the cloud. Any or all of the devices 1a to 1n / 2a to 2m coupled to the central network controller or network switch can collect data in real time and transfer the data to cloud computers for data processing and manipulation. It will be understood that cloud computing depends on sharing computing resources instead of having local servers or personal devices handle software applications. The word "cloud" can be used as a metaphor for "the Internet", although the term is not limited as such. Consequently, the term "cloud computing" can be used here to refer to "a type of Internet-based computing", in which different services, such as servers, storage, and applications, are delivered to the modular communication center 203 and/or the computer system 210 located in the operating room (for example, a fixed, mobile, temporary, or field operating room or space) and to devices connected to the modular communication center 203 and/or the computer system 210 over the Internet. The cloud infrastructure can be maintained by a cloud service provider. In this context, the cloud service provider may be the entity that coordinates the use and control of the devices 1a to 1n / 2a to 2m located in one or more operating rooms. Cloud computing services can perform a large number of calculations based on the data collected by smart surgical instruments, robots, and other computerized devices located in the operating room. The central controller hardware allows multiple devices or connections to be connected to a computer that communicates with the cloud computing and storage resources.
[0133] [0133] By applying cloud computing data processing techniques to the data collected by the devices 1a to 1n / 2a to 2m, the surgical data network can provide better surgical results, reduced costs, and better patient satisfaction. At least some of the devices 1a to 1n / 2a to 2m can be used to view tissue status to assess for leakage or perfusion of sealed tissue after a tissue sealing and cutting procedure. At least some of the devices 1a to 1n / 2a to 2m can be used to identify pathology, such as the effects of disease, with the use of cloud-based computing to examine data, including images of body tissue samples, for diagnostic purposes. This includes confirmation of tissue location and margin, and of phenotypes. At least some of the devices 1a to 1n / 2a to 2m can be used to identify anatomical structures of the body using a variety of sensors integrated with imaging devices and techniques, such as overlaying images captured by multiple imaging devices. The data collected by the devices 1a to 1n / 2a to 2m, including the image data, can be transferred to the cloud 204 or the local computer system 210, or both, for data processing and manipulation, including image processing and manipulation. The data can be analyzed to improve the results of the surgical procedure by determining whether additional treatment, such as the application of endoscopic intervention, emerging technologies, targeted radiation, targeted intervention, or precision robotics at specific tissue sites and conditions, should follow. Such data analysis can additionally use analytical processing of the results and, with the use of standardized approaches, can provide beneficial standardized feedback either to confirm surgical treatments and the surgeon's behavior or to suggest changes to surgical treatments and the surgeon's behavior.
[0134] [0134] In one implementation, the operating room devices 1a to 1n can be connected to the modular communication center 203 via a wired channel or a wireless channel, depending on the configuration of the devices 1a to 1n on a central network controller. The central network controller 207 can be implemented, in one aspect, as a local network transmission device that acts on the physical layer of the OSI ("open system interconnection") model. The central network controller provides connectivity to the devices 1a to 1n located on the same network as the operating room. The central network controller 207 collects data in the form of packets and sends it to the router in half-duplex mode.
[0135] [0135] In another implementation, the operating room devices 2a to 2m can be connected to a network switch 209 via a wired or wireless channel. The network switch 209 works in the data link layer of the OSI model. The network switch 209 is a multicast device for connecting the devices 2a to 2m located in the same operating room to the network. The network switch 209 sends data in frame form to the network router 211 and works in full-duplex mode. Multiple devices 2a to 2m can send data at the same time via the network switch 209. The network switch 209 stores and uses the MAC addresses of the devices 2a to 2m to transfer data.
[0136] [0136] The central network controller 207 and / or the network key 209 are coupled to the network router 211 for a connection to the cloud
[0137] [0137] In one example, the central network controller 207 can be implemented as a central USB controller, which allows multiple USB devices to be connected to a host computer. The central USB controller can expand a single USB port into several levels so that more ports are available to connect the devices to the system's host computer. The central network controller 207 can include wired or wireless capabilities to receive information over a wired channel or a wireless channel. In one aspect, a short-range, high-bandwidth wireless USB communication protocol can be used for communication between the devices 1a to 1n and the devices 2a to 2m located in the operating room.
[0138] [0138] In other examples, the operating room devices 1a to 1n / 2a to 2m can communicate with the modular communication center 203 via standard Bluetooth wireless technology for exchanging data over short distances (using short-wavelength UHF radio waves in the 2.4 to 2.485 GHz ISM band) from fixed and mobile devices and for building personal area networks (PANs). In other respects, the operating room devices 1a to 1n / 2a to 2m can communicate with the modular communication center 203 through a number of wireless or wired communication standards or protocols, including, but not limited to, Wi-Fi (IEEE 802.11 family), WiMAX (IEEE 802.16 family), IEEE 802.20, long-term evolution (LTE), and Ev-DO, HSPA+, HSDPA+, HSUPA+, EDGE, GSM, GPRS, CDMA, TDMA, DECT, and Ethernet derivatives thereof, as well as any other wireless and wired protocols designated as 3G, 4G, 5G, and beyond. The computing module can include a plurality of communication modules. For example, a first communication module can be dedicated to shorter-range wireless communications such as Wi-Fi and Bluetooth, and a second communication module can be dedicated to longer-range wireless communications such as GPS, EDGE, GPRS, CDMA, WiMAX, LTE, Ev-DO, and others.
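The split between a short-range and a longer-range communication module described above can be sketched as a simple dispatch. The module names and grouping are illustrative assumptions.

```python
# Illustrative dispatch between the two communication modules
# described above; the module identifiers are hypothetical.
SHORT_RANGE = {"Wi-Fi", "Bluetooth"}
LONG_RANGE = {"GPS", "EDGE", "GPRS", "CDMA", "WiMAX", "LTE", "Ev-DO"}

def select_module(protocol):
    """Pick which communication module should handle a protocol."""
    if protocol in SHORT_RANGE:
        return "module_1_short_range"
    if protocol in LONG_RANGE:
        return "module_2_long_range"
    raise ValueError(f"unsupported protocol: {protocol}")
```

In practice the computing module would route each connection request through whichever module owns the radio for that protocol family.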
[0139] [0139] The modular communication center 203 can serve as a central connection for one or all of the operating room devices 1a to 1n / 2a to 2m and handles a data type known as frames. The frames carry the data generated by the devices 1a to 1n / 2a to 2m. When a frame is received by the modular communication center 203, it is amplified and transmitted to the network router 211, which transfers the data to the cloud computing resources using a number of wireless or wired communication standards or protocols, as described in the present invention.
[0140] [0140] The modular communication center 203 can be used as a standalone device or be connected to compatible central network controllers and network switches to form a larger network. The modular communication center 203 is, in general, easy to install, configure and maintain, making it a good option for the network of devices 1a to 1n / 2a to 2m from the operating room.
[0141] [0141] Figure 9 illustrates a computer-implemented interactive surgical system 200. The computer-implemented interactive surgical system 200 is similar in many respects to the computer-implemented interactive surgical system 100. For example, the computer-implemented interactive surgical system 200 includes one or more surgical systems 202, which are similar in many respects to the surgical systems 102. Each surgical system 202 includes at least one central surgical controller 206 in communication with a cloud 204 that may include a remote server
[0142] [0142] Figure 10 illustrates a central surgical controller 206 that comprises a plurality of modules coupled to the modular control tower 236. The modular control tower 236 comprises a modular communication center 203, for example, a network connectivity device, and a computer system 210 to provide local processing, visualization, and imaging, for example. As shown in Figure 10, the modular communication center 203 can be connected in a layered configuration to expand the number of modules (for example, devices) that can be connected to the modular communication center 203 and to transfer data associated with the modules to the computer system 210, the cloud computing resources, or both. As shown in Figure 10, each of the central network controllers/network switches in the modular communication center 203 includes three downstream ports and one upstream port. The upstream central network controller/network switch is connected to a processor to provide a communication connection to the cloud computing resources and a local screen 217. Communication with the cloud 204 can be done via a wired or wireless communication channel.
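The layered expansion described above follows directly from the port counts: with one upstream and three downstream ports per hub/switch, chaining a further layer of hubs off every downstream port multiplies the available connections. A small worked sketch, under that assumption:

```python
# With one upstream and three downstream ports per hub/switch,
# feeding every downstream port of a layer into another hub leaves
# 3**layers free downstream ports at the last layer.
def downstream_ports(layers, ports_per_hub=3):
    """Free module connections after `layers` layers of hubs."""
    return ports_per_hub ** layers

# One hub exposes 3 ports; a second layer exposes 9; a third, 27.
```

This is why the layered configuration "expands the number of modules" without changing the hub hardware itself.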
[0143] [0143] The central surgical controller 206 employs a non-contact sensor module 242 to measure the dimensions of the operating room and generate a map of the operating room using non-contact measuring devices, such as laser or ultrasonic devices. An ultrasound-based non-contact sensor module scans the operating room by transmitting an ultrasound burst and receiving the echo as it bounces off the perimeter walls of the operating room, as described under the heading "Surgical Hub Spatial Awareness Within an Operating Room" in US Provisional Patent Application Serial No. 62/611,341, entitled INTERACTIVE SURGICAL PLATFORM, filed on December 28, 2017, whose description is hereby incorporated by reference in its entirety, in which the sensor is configured to determine the size of the operating room and adjust the Bluetooth pairing distance limits. A laser-based non-contact sensor module scans the operating room by transmitting pulses of laser light, receiving pulses of laser light that bounce off the perimeter walls of the operating room, and comparing the phase of the transmitted pulse with that of the received pulse to determine the size of the operating room and to adjust the Bluetooth pairing distance limits, for example.
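The phase-comparison ranging described above reduces to a short calculation: the measured phase shift at a known modulation frequency gives the round-trip delay, and half of that delay times the speed of light gives the wall distance. A minimal sketch, assuming a sinusoidally modulated beam and a hypothetical pairing-limit rule:

```python
import math

C = 299_792_458.0          # speed of light, m/s

def distance_from_phase(phase_shift_rad, modulation_hz):
    """Phase-shift rangefinding, as in the laser-based sensor module:
    round-trip delay = phase shift / (2*pi*f); halve for one way."""
    round_trip = phase_shift_rad / (2.0 * math.pi * modulation_hz)
    return C * round_trip / 2.0

def pairing_limit(room_size_m, margin=1.1):
    """Hypothetical rule: cap the Bluetooth pairing distance just
    beyond the measured room dimension (margin is an assumption)."""
    return room_size_m * margin

# Example: a pi-radian shift at 10 MHz modulation ~ 7.5 m to the wall.
wall_distance = distance_from_phase(math.pi, 1.0e7)
```

The ultrasound-based variant works the same way with the speed of sound and echo timing in place of the optical phase comparison.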
[0144] [0144] Computer system 210 comprises a processor 244 and a network interface 245. Processor 244 is coupled to a communication module 247, storage 248, memory 249, non-volatile memory 250, and an input/output interface 251 through a system bus. The system bus can be any of several types of bus structures, including the memory bus or memory controller, a peripheral bus or external bus, and/or a local bus that uses any variety of available bus architectures including, but not limited to, 9-bit bus, Industry Standard Architecture (ISA), Micro-Channel Architecture (MSA), Extended ISA (EISA), Intelligent Drive Electronics (IDE), VESA Local Bus (VLB), Peripheral Component Interconnect (PCI), USB, Accelerated Graphics Port (AGP), Personal Computer Memory Card International Association bus (PCMCIA), Small Computer Systems Interface (SCSI), or any other proprietary bus.
[0145] [0145] Processor 244 can be any single-core or multi-core processor, such as those known under the ARM Cortex trade name, available from Texas Instruments. In one aspect, the processor may be an LM4F230H5QR ARM Cortex-M4F processor core, available from Texas Instruments, for example, which comprises 256 KB of integrated single-cycle flash memory, or other non-volatile memory, up to 40 MHz, a prefetch buffer to improve performance above 40 MHz, a 32 KB single-cycle serial random access memory (SRAM), an internal read-only memory (ROM) loaded with the StellarisWare® software, 2 KB of electrically erasable programmable read-only memory (EEPROM), one or more pulse width modulation (PWM) modules, one or more quadrature encoder input (QEI) analogs, and one or more 12-bit analog-to-digital converters (ADCs) with 12 analog input channels, details of which are available in the product datasheet.
[0146] [0146] In one aspect, processor 244 may comprise a safety controller comprising two controller-based families, such as TMS570 and RM4x, known under the trade name Hercules ARM Cortex R4, also available from Texas Instruments. The safety controller can be configured specifically for IEC 61508 and ISO 26262 safety-critical applications, among others, to provide advanced integrated safety features while delivering scalable performance, connectivity, and memory options.
[0147] [0147] System memory includes volatile and non-volatile memory. The basic input/output system (BIOS), containing the basic routines for transferring information between elements within the computer system, such as during startup, is stored in non-volatile memory. For example, non-volatile memory can include ROM, programmable ROM (PROM), electrically programmable ROM (EPROM), EEPROM, or flash memory. Volatile memory includes random access memory (RAM), which acts as external cache memory. In addition, RAM is available in many forms, such as SRAM, dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and direct Rambus RAM (DRRAM).
[0148] [0148] Computer system 210 also includes removable/non-removable, volatile/non-volatile computer storage media, such as disk storage. Disk storage includes, but is not limited to, devices such as a magnetic disk drive, floppy disk drive, tape drive, Jaz drive, Zip drive, LS-60 drive, flash memory card, or memory stick. In addition, disk storage may include storage media separately or in combination with other storage media including, but not limited to, an optical disc drive such as a compact disc ROM device (CD-ROM), recordable compact disc drive (CD-R drive), rewritable compact disc drive (CD-RW drive), or a digital versatile disc ROM drive (DVD-ROM). To facilitate the connection of disk storage devices to the system bus, a removable or non-removable interface can be used.
[0149] [0149] It is to be understood that computer system 210 includes software that acts as an intermediary between users and the basic computer resources described in a suitable operating environment. Such software includes an operating system. The operating system, which can be stored on disk storage, acts to control and allocate computer system resources. System applications benefit from the management capabilities of the operating system through program modules and program data stored in system memory or on the storage disk. It is to be understood that the various components described in the present invention can be implemented with various operating systems or combinations of operating systems.
[0150] [0150] A user enters commands or information into computer system 210 through the input device(s) coupled to the I/O interface 251. Input devices include, but are not limited to, a pointing device such as a mouse, trackball, stylus, touchpad, keyboard, microphone, joystick, game pad, satellite dish, scanner, TV tuner card, digital camera, digital video camera, web camera, and the like. These and other input devices connect to the processor via the system bus through the interface port(s). The interface ports include, for example, a serial port, a parallel port, a game port, and a USB port. Output devices use some of the same types of ports as input devices. Thus, for example, a USB port can be used to provide input to the computer system and to provide information from the computer system to an output device. An output adapter is provided to illustrate that there are some output devices, such as monitors, screens, speakers, and printers, among other output devices, that require special adapters. Output adapters include, by way of illustration and not limitation, video and sound cards that provide a means of connection between the output device and the system bus. It should be noted that other devices and/or device systems, such as remote computers, provide both input and output capabilities.
[0151] [0151] Computer system 210 can operate in a networked environment using logical connections to one or more remote computers, such as cloud computers, or local computers. Remote cloud computers can be a personal computer, server, router, network PC, workstation, microprocessor-based device, peer device, or other common network node, and the like, and typically include many or all of the elements described in relation to the computer system. For the sake of brevity, only one memory storage device is illustrated with the remote computer. Remote computers are logically connected to the computer system via a network interface and then physically connected via a communication connection. The network interface covers communication networks such as local area networks (LANs) and wide area networks (WANs). LAN technologies include Fiber Distributed Data Interface (FDDI), Copper Distributed Data Interface (CDDI), Ethernet/IEEE 802.3, Token Ring/IEEE 802.5, and the like. WAN technologies include, but are not limited to, point-to-point links, circuit-switching networks such as Integrated Services Digital Networks (ISDN) and variations thereon, packet-switching networks, and digital subscriber lines (DSL).
[0152] [0152] In several respects, computer system 210 of Figure 10, imaging module 238 and/or display system 208, and/or processor module 232 of Figures 9 to 10, may comprise an image processor, image processing engine, media processor, or any specialized digital signal processor (DSP) used for processing digital images. The image processor can employ parallel computing with single instruction, multiple data (SIMD) or multiple instruction, multiple data (MIMD) technologies to increase speed and efficiency. The digital image processing engine can perform a range of tasks. The image processor can be an integrated circuit system with a multi-core processor architecture.
[0153] [0153] Communication connections refer to the hardware/software used to connect the network interface to the bus. Although the communication connection is shown for illustrative clarity within the computer system, it can also be external to computer system 210. The hardware/software required for connection to the network interface includes, for illustrative purposes only, internal and external technologies such as modems, including regular telephone-grade modems, cable modems, and DSL modems, ISDN adapters, and Ethernet cards.
[0154] [0154] Figure 11 illustrates a functional block diagram of one aspect of a central USB network controller device 300, in accordance with one aspect of the present description. In the illustrated aspect, the central USB network controller device 300 uses a TUSB2036 integrated circuit central controller available from Texas Instruments. The central USB network controller 300 is a CMOS device that provides one upstream USB transceiver port 302 and up to three downstream USB transceiver ports 304, 306, 308 in accordance with the USB 2.0 specification. The upstream USB transceiver port 302 is a differential root data port comprising a differential data "minus" input (DM0) paired with a differential data "plus" input (DP0). The three downstream USB transceiver ports 304, 306, 308 are differential data ports, where each port includes differential data "plus" outputs (DP1-DP3) paired with differential data "minus" outputs (DM1-DM3).
[0155] [0155] The central USB network controller device 300 is implemented with a digital state machine instead of a microcontroller, and no firmware programming is required. Fully compatible USB transceivers are integrated into the circuit for the upstream USB transceiver port 302 and all downstream USB transceiver ports 304, 306, 308. The downstream USB transceiver ports 304, 306, 308 support both full-speed and low-speed devices by automatically configuring the slew rate according to the speed of the device attached to the ports. The central USB network controller device 300 can be configured in bus-powered or self-powered mode and includes central power logic 312 to manage power.
[0156] [0156] The central USB network controller device 300 includes a serial interface engine (SIE) 310. The SIE 310 is the front end of the central USB network controller 300 hardware and handles most of the protocol described in chapter 8 of the USB specification. The SIE 310 typically comprises signaling up to the transaction level. The functions it handles can include: packet recognition, transaction sequencing, SOP, EOP, RESET, and RESUME signal detection/generation, clock/data separation, non-return-to-zero inverted (NRZI) data encoding/decoding, CRC generation and checking (token and data), packet ID (PID) generation and checking/decoding, and/or serial-parallel/parallel-serial conversion. The SIE 310 receives a clock input 314 and is coupled to a suspend/resume logic and frame timer circuit 316 and a central repeater circuit 318 to control communication between the upstream USB transceiver port 302 and the downstream USB transceiver ports 304, 306, 308 through port logic circuits 320, 322, 324. The SIE 310 is coupled to a command decoder 326 via interface logic to control commands from a serial EEPROM via a serial EEPROM interface 330.
[0157] [0157] In several aspects, the central USB network controller 300 can connect 127 functions configured in up to six logical layers (levels) to a single computer. In addition, the central USB network controller 300 can connect all peripherals using a standardized four-wire cable that provides both communication and power distribution. The power configurations are bus-powered and self-powered modes. The central USB network controller 300 can be configured to support four power management modes: a bus-powered central controller with individual-port or grouped-port power management, and a self-powered central controller with individual-port or grouped-port power management. In one aspect, using a USB cable, the upstream USB transceiver port 302 of the central USB network controller 300 is plugged into a USB host controller, and the downstream USB transceiver ports 304, 306, 308 are exposed to connect compatible USB devices, and so on.
Surgical instrument hardware
[0158] [0158] Figure 12 illustrates a logic diagram of a control system 470 of a surgical instrument or tool, according to one or more aspects of the present description. The system 470 comprises a control circuit. The control circuit includes a microcontroller 461 comprising a processor 462 and a memory 468. One or more of the sensors 472, 474, 476, for example, provide real-time feedback to processor 462. A motor 482, driven by a motor driver 492, operationally couples a longitudinally movable displacement member to drive the I-beam knife element. A tracking system 480 is configured to determine the position of the longitudinally movable displacement member. Position information is provided to processor 462, which can be programmed or configured to determine the position of the longitudinally movable drive member, as well as the position of a firing member, firing bar, and I-beam knife element. Additional motors can be provided at the instrument driver interface to control I-beam firing, closure tube displacement, drive shaft rotation, and articulation. A display 473 shows a variety of instrument operating conditions and can include touchscreen functionality for data entry. The information displayed on display 473 can be overlaid with images captured using endoscopic imaging modules.
[0159] [0159] In one aspect, microcontroller 461 can be any single-core or multi-core processor, such as those known under the ARM Cortex trade name, available from Texas Instruments. In one aspect, the main microcontroller 461 can be an LM4F230H5QR ARM Cortex-M4F processor core, available from Texas Instruments, for example, which comprises 256 KB of integrated single-cycle flash memory, or other non-volatile memory, up to 40 MHz, a prefetch buffer to improve performance above 40 MHz, a 32 KB single-cycle serial random access memory (SRAM), an internal read-only memory (ROM) loaded with the StellarisWare® software, 2 KB of electrically erasable programmable read-only memory (EEPROM), one or more pulse width modulation (PWM) modules, one or more quadrature encoder input (QEI) analogs, and/or one or more 12-bit analog-to-digital converters (ADCs) with 12 analog input channels, details of which are available in the product datasheet.
[0160] [0160] In one aspect, microcontroller 461 can comprise a safety controller comprising two controller-based families, such as TMS570 and RM4x, known under the trade name Hercules ARM Cortex R4, also available from Texas Instruments. The safety controller can be configured specifically for IEC 61508 and ISO 26262 safety-critical applications, among others, to provide advanced integrated safety features while delivering scalable performance, connectivity, and memory options.
[0161] [0161] Microcontroller 461 can be programmed to perform various functions, such as precise control of the speed and position of the knife and articulation systems. In one aspect, microcontroller 461 includes a processor 462 and a memory 468. The electric motor 482 can be a brushed direct current (DC) motor with a gearbox and mechanical links to an articulation or knife system. In one aspect, motor driver 492 can be an A3941 available from Allegro Microsystems, Inc. Other motor drivers can be readily substituted for use in tracking system 480, which comprises an absolute positioning system. A detailed description of an absolute positioning system is given in US Patent Application Publication No. 2017/0296213, entitled SYSTEMS AND METHODS FOR
[0162] [0162] Microcontroller 461 can be programmed to provide precise control of the speed and position of the displacement members and articulation systems. Microcontroller 461 can be configured to compute a response in the microcontroller 461 software. The computed response is compared to a measured response of the real system to obtain an "observed" response, which is used for actual feedback-based decisions. The observed response is a favorable, tuned value that balances the smooth, continuous nature of the simulated response with the measured response, which can detect outside influences on the system.
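One simple way to form such an "observed" response is a weighted blend of the simulated and measured values. The sketch below is an assumption about the form of that blend, not the patent's algorithm; the function name and the weight alpha are invented for illustration.

```python
# Minimal sketch (an assumption, not the patent's method) of blending a
# smooth simulated response with a noisy measured response to form the
# "observed" response used for feedback decisions.

def observed_response(simulated: float, measured: float, alpha: float = 0.7) -> float:
    """Complementary blend: alpha weights the simulated value, the rest
    comes from the measurement, which carries external disturbances."""
    return alpha * simulated + (1.0 - alpha) * measured

# The result lies between its two inputs: it keeps the continuity of the
# simulation while still tracking what the real system is doing.
obs = observed_response(10.0, 12.0)
```

With alpha = 0.7, a simulated value of 10.0 and a measured value of 12.0 blend to 10.6, pulled toward the measurement just enough to reflect the real system.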
[0163] [0163] In one aspect, motor 482 can be controlled by motor driver 492 and can be used by the firing system of the surgical instrument or tool. In many aspects, motor 482 can be a brushed direct current (DC) drive motor with a maximum speed of approximately 25,000 RPM, for example. In other arrangements, motor 482 may include a brushless motor, a cordless motor, a synchronous motor, a stepper motor, or any other suitable type of electric motor. Motor driver 492 may comprise an H-bridge driver comprising field-effect transistors (FETs), for example. Motor 482 can be powered by a power assembly releasably mounted on the handle assembly or tool housing to provide control power for the surgical instrument or tool. The power assembly may comprise a battery that may include several battery cells connected in series, which can be used as the power source to power the surgical instrument or tool. In certain circumstances, the battery cells in the power assembly may be replaceable and/or rechargeable. In at least one example, the battery cells can be lithium-ion batteries that can be coupled to and separable from the power assembly.
[0164] [0164] Motor driver 492 can be an A3941, available from Allegro Microsystems, Inc. The A3941 driver 492 is a full-bridge controller for use with external N-channel power metal-oxide-semiconductor field-effect transistors (MOSFETs), specifically designed for inductive loads such as brushed DC motors. Driver 492 comprises a unique charge-pump regulator that provides full (>10 V) gate drive for battery voltages down to 7 V and allows the A3941 to operate with reduced gate drive down to 5.5 V. A bootstrap capacitor can be used to provide the above-battery supply voltage required for the N-channel MOSFETs. An internal charge pump for the high-side drive allows direct current (100% duty cycle) operation. The full bridge can be driven in fast or slow decay modes using diode or synchronous rectification. In slow decay mode, current can recirculate through either the high-side or the low-side FETs. Resistor-programmable dead time protects the power FETs from shoot-through. Integrated diagnostics provide indications of undervoltage, overtemperature, and power bridge faults, and can be configured to protect the power MOSFETs under most short-circuit conditions. Other motor drivers can be readily substituted for use in the tracking system 480 comprising an absolute positioning system.
[0165] [0165] The tracking system 480 comprises a controlled motor drive circuit arrangement comprising a position sensor 472 in accordance with an aspect of the present description. The position sensor 472 of an absolute positioning system provides a unique position signal that corresponds to the location of a displacement member. In one aspect, the displacement member represents a longitudinally movable drive member comprising a rack of drive teeth for engagement with a corresponding drive gear of a gear reduction assembly.
[0166] [0166] The electric motor 482 may include a rotatable drive shaft that operationally interfaces with a gear assembly mounted in meshing engagement with a set or rack of drive teeth on the displacement member. A sensor element can be operationally coupled to a gear assembly so that a single revolution of the position sensor 472 element corresponds to some linear longitudinal translation of the displacement member. An arrangement of gears and sensors can be connected to the linear actuator via a rack-and-pinion arrangement, or to a rotary actuator via a spur gear or other connection. A power source supplies power to the absolute positioning system, and an output indicator can display the output of the absolute positioning system. The drive member represents the longitudinally movable drive member comprising a rack of drive teeth formed thereon for engagement with a corresponding drive gear of the gear reducer assembly. The displacement member represents the longitudinally movable firing member, the firing bar, the I-beam, or combinations thereof.
[0167] [0167] A single revolution of the sensor element associated with position sensor 472 is equivalent to a longitudinal linear displacement d1 of the displacement member, where d1 is the longitudinal linear distance that the displacement member travels from point "a" to point "b" after a single revolution of the sensor element coupled to the displacement member. The sensor arrangement can be connected via a gear reduction that results in position sensor 472 completing one or more revolutions for the full travel of the displacement member. Position sensor 472 can complete multiple revolutions for the full travel of the displacement member.
[0168] [0168] A series of switches, where n is an integer greater than one, can be used alone or in combination with a gear reduction to provide a unique position signal for more than one revolution of position sensor 472. The state of the switches is fed back to microcontroller 461, which applies logic to determine a unique position signal corresponding to the longitudinal linear displacement d1 + d2 + … dn of the displacement member. The output of position sensor 472 is supplied to microcontroller 461. In several embodiments, position sensor 472 of the sensor arrangement may comprise a magnetic sensor, an analog rotary sensor such as a potentiometer, or an array of analog Hall-effect elements, which output a unique combination of position signals or values.
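The conversion from a rotary absolute position reading to the linear displacement of the drive member can be sketched as follows. This is an illustrative calculation only; the sensor resolution, the millimeters-per-revolution value standing in for d1, and the function name are all assumptions, not figures from the patent.

```python
# Illustrative sketch: converting a revolution count plus an absolute rotary
# sensor reading into the linear displacement of the drive member, where one
# sensor revolution corresponds to a fixed linear travel (d1).

SENSOR_COUNTS_PER_REV = 4096   # e.g. a 12-bit rotary position sensor (assumed)
MM_PER_SENSOR_REV = 2.5        # d1: linear travel per sensor revolution (assumed)

def linear_displacement_mm(revolutions: int, counts: int) -> float:
    """Absolute linear position: whole revolutions plus the fractional turn."""
    fraction = counts / SENSOR_COUNTS_PER_REV
    return (revolutions + fraction) * MM_PER_SENSOR_REV
```

For example, three full revolutions plus a half-turn reading (2048 of 4096 counts) gives 3.5 × 2.5 = 8.75 mm of travel.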
[0169] [0169] Position sensor 472 can comprise any number of magnetic detection elements, such as, for example, magnetic sensors classified according to whether they measure the total magnetic field or the vector components of the magnetic field. The techniques used to produce both types of magnetic sensors cover many aspects of physics and electronics. Technologies used for magnetic field detection include search coil, fluxgate, optically pumped, nuclear precession, SQUID, Hall effect, anisotropic magnetoresistance, giant magnetoresistance, magnetic tunnel junctions, giant magnetoimpedance, magnetostrictive/piezoelectric composites, magnetodiode, magnetotransistor, fiber optic, magneto-optic, and microelectromechanical systems-based magnetic sensors, among others.
[0170] [0170] In one aspect, position sensor 472 for tracking system 480 comprising an absolute positioning system comprises a magnetic rotary absolute positioning system. Position sensor 472 can be implemented as an AS5055EQFT single-chip magnetic rotary position sensor, available from Austria Microsystems, AG. Position sensor 472 interfaces with microcontroller 461 to provide an absolute positioning system. Position sensor 472 is a low-voltage, low-power component and includes four Hall-effect elements in an area of position sensor 472 located above a magnet. A high-resolution ADC and an intelligent power management controller are also provided on the integrated circuit. A CORDIC (coordinate rotation digital computer) processor, also known as the digit-by-digit method or Volder's algorithm, is provided to implement a simple and efficient algorithm for calculating hyperbolic and trigonometric functions that requires only addition, subtraction, bit-shift, and table-lookup operations. The angle position, alarm bits, and magnetic field information are transmitted via a standard serial communication interface, such as a serial peripheral interface (SPI), to microcontroller 461. Position sensor 472 provides 12 or 14 bits of resolution. Position sensor 472 can be an AS5055 integrated circuit supplied in a small 16-pin QFN package measuring 4 x 4 x 0.85 mm.
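To illustrate the CORDIC technique named above, here is a minimal rotation-mode sketch computing sine and cosine from additions, subtractions, shifts (modeled here as multiplications by powers of two), and a small arctangent lookup table. This is a generic textbook CORDIC, not the AS5055's internal implementation; the iteration count and names are assumptions.

```python
import math

# Illustrative CORDIC (Volder's algorithm) in rotation mode: rotate the
# vector (1, 0) through the target angle using only shift-and-add steps,
# then divide out the fixed CORDIC gain K.

N = 24  # number of micro-rotations (assumed)
ATAN_TABLE = [math.atan(2.0 ** -i) for i in range(N)]

# Accumulated gain of N micro-rotations; constant, so computed once.
K = 1.0
for i in range(N):
    K *= math.sqrt(1.0 + 2.0 ** (-2 * i))

def cordic_sin_cos(angle: float):
    """Return (sin, cos) of angle (radians, |angle| <= ~1.74) via CORDIC."""
    x, y, z = 1.0, 0.0, angle
    for i in range(N):
        d = 1.0 if z >= 0 else -1.0           # rotate toward zero residual
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * ATAN_TABLE[i]                # subtract the table angle
    return y / K, x / K
```

In fixed-point hardware the `* 2.0 ** -i` terms become arithmetic right shifts, which is why CORDIC suits a small position-sensor ASIC.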
[0171] [0171] Tracking system 480 comprising an absolute positioning system can comprise and/or be programmed to implement a feedback controller, such as a PID, state feedback, or adaptive controller. A power source converts the signal from the feedback controller into a physical input to the system: in this case, voltage. Other examples include PWM of voltage, current, and force. Other sensors can be provided to measure parameters of the physical system in addition to the position measured by position sensor 472. In some aspects, the other sensors may include sensor arrangements as described in US Patent No. 9,345,481, entitled STAPLE CARTRIDGE TISSUE THICKNESS SENSOR SYSTEM, granted on May 24, 2016, which is incorporated by reference in its entirety in this document; US Patent Application Publication No. 2014/0263552, entitled STAPLE CARTRIDGE TISSUE THICKNESS SENSOR SYSTEM, published on September 18, 2014, which is incorporated by reference in its entirety in this document; and US Patent Application Serial No. 15 / 628,175, entitled TECHNIQUES FOR ADAPTIVE
[0172] [0172] The absolute positioning system provides the absolute position of the displacement member upon activation of the instrument, without having to retract or advance the longitudinally movable drive member to a reset (zero or home) position, as may be required by conventional rotary encoders that merely count the number of forward or backward steps that motor 482 has taken to infer the position of a device actuator, drive bar, knife, and the like.
[0173] [0173] A sensor 474, such as a strain gauge or a micro-strain gauge, is configured to measure one or more parameters of the end actuator, such as the amplitude of the strain exerted on the anvil during a clamping operation, which can be indicative of tissue compression. The measured strain is converted into a digital signal and fed to processor 462. Alternatively, or in addition to sensor 474, a sensor 476, such as a load sensor, can measure the closing force applied by the closure drive system to the anvil. Sensor 476, such as a load sensor, can measure the firing force applied to an I-beam in a firing stroke of the surgical instrument or tool. The I-beam is configured to engage a wedge sled, which is configured to cam the staple drivers upward to push the staples into deforming contact with an anvil. The I-beam also includes a sharp cutting edge that can be used to sever tissue as the I-beam is advanced distally by the firing bar. Alternatively, a current sensor 478 can be used to measure the current drawn by motor 482. The force required to advance the firing member can correspond to the current drawn by motor 482, for example. The measured force is converted into a digital signal and supplied to processor 462.
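The current-to-force correspondence mentioned above follows from motor torque being roughly proportional to current. The sketch below shows one plausible conversion chain; every constant (torque constant, gear ratio, effective radius, efficiency) is an invented illustrative value, not a figure from the patent.

```python
# Illustrative sketch (all constants are assumptions): inferring the force
# on the firing member from the current drawn by motor 482. Torque is
# approximately Kt * I; the drive train turns torque into linear force.

TORQUE_CONSTANT_NM_PER_A = 0.025   # motor torque constant Kt (assumed)
GEAR_RATIO = 100.0                 # drive-train reduction (assumed)
SCREW_RADIUS_M = 0.004             # effective radius: torque -> force (assumed)

def firing_force_n(motor_current_a: float, efficiency: float = 0.8) -> float:
    """Estimate firing-member force (newtons) from motor current (amps)."""
    torque_nm = motor_current_a * TORQUE_CONSTANT_NM_PER_A * GEAR_RATIO
    return efficiency * torque_nm / SCREW_RADIUS_M
```

Under these assumed constants, a 2 A draw maps to roughly 1 kN on the firing member; a real device would calibrate this mapping empirically.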
[0174] [0174] In one form, a strain gauge sensor 474 can be used to measure the force applied to the tissue by the end actuator. A strain gauge can be attached to the end actuator to measure the force applied to the tissue being treated by the end actuator. A system for measuring forces applied to the tissue grasped by the end actuator comprises a strain gauge sensor 474, such as, for example, a micro-strain gauge, which is configured to measure one or more parameters of the end actuator, for example. In one aspect, strain gauge sensor 474 can measure the amplitude or magnitude of the mechanical strain exerted on a jaw member of an end actuator during a clamping operation, which can be indicative of tissue compression. The measured strain is converted into a digital signal and fed to processor 462 of microcontroller 461.
[0175] [0175] Measurements of tissue compression, tissue thickness, and/or the force required to close the end actuator on the tissue, as measured by sensors 474, 476, can be used by microcontroller 461 to characterize the selected position of the firing member and/or the corresponding firing member speed value. In one case, memory 468 can store a technique, an equation, and/or a lookup table that microcontroller 461 can use in the evaluation.
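The lookup-table approach mentioned above can be sketched as a small table mapping a measured closure force (a proxy for tissue thickness/compression) to a firing member speed. The thresholds and speeds below are invented illustrative values, not data from the patent.

```python
# Minimal sketch (thresholds and speeds are assumptions) of the kind of
# lookup table memory 468 might store: higher closure force suggests
# thicker/denser tissue, so the firing member is driven more slowly.

# (maximum closure force in newtons, firing speed in mm/s)
SPEED_TABLE = [
    (50.0, 12.0),          # thin tissue: fire quickly
    (120.0, 7.0),          # medium tissue
    (float("inf"), 3.0),   # thick tissue: slow down
]

def firing_speed_mm_s(closure_force_n: float) -> float:
    """Pick the firing speed for the first force band the measurement fits."""
    for max_force, speed in SPEED_TABLE:
        if closure_force_n <= max_force:
            return speed
    return SPEED_TABLE[-1][1]
```

A table like this keeps the per-tick control decision to a cheap bounded scan, which suits a microcontroller loop.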
[0176] [0176] The control system 470 of the surgical instrument or tool can also comprise wired or wireless communication circuits for communicating with the modular communication center shown in Figures 8 to 11.
[0177] [0177] Figure 13 illustrates a control circuit 500 configured to control aspects of the surgical instrument or tool according to an aspect of the present description. Control circuit 500 can be configured to implement various processes described herein. Control circuit 500 may comprise a microcontroller comprising one or more processors 502 (for example, a microprocessor or microcontroller) coupled to at least one memory circuit 504. Memory circuit 504 stores machine-executable instructions that, when executed by processor 502, cause processor 502 to execute machine instructions to implement several of the processes described here. Processor 502 can be any one of a number of single-core or multi-core processors known in the art. Memory circuit 504 may comprise volatile and non-volatile storage media. Processor 502 can include an instruction processing unit 506 and an arithmetic unit 508. The instruction processing unit can be configured to receive instructions from memory circuit 504.
[0178] [0178] Figure 14 illustrates a combinational logic circuit 510 configured to control aspects of the instrument or surgical tool according to an aspect of the present description. The combinational logic circuit 510 can be configured to implement various processes described herein. The combinational logic circuit 510 may comprise a finite state machine comprising a combinational logic 512 configured to receive data associated with the surgical instrument or tool at an input 514, process the data by combinational logic 512 and provide an output 516.
[0179] [0179] Figure 15 illustrates a sequential logic circuit 520 configured to control aspects of the surgical instrument or tool according to an aspect of the present description. Sequential logic circuit 520 or combinational logic 522 can be configured to implement the processes described herein. Sequential logic circuit 520 may comprise a finite state machine. Sequential logic circuit 520 may comprise combinational logic 522, at least one memory circuit 524, and a clock 529, for example. The at least one memory circuit 524 can store a current state of the finite state machine. In certain cases, sequential logic circuit 520 may be synchronous or asynchronous. Combinational logic 522 is configured to receive data associated with the surgical instrument or tool from an input 526, process the data by combinational logic 522, and provide an output 528. In other aspects, the circuit may comprise a combination of a processor (for example, processor 502, Figure 13) and a finite state machine to implement various processes of the present invention. In other aspects, the finite state machine may comprise a combination of a combinational logic circuit (for example, a combinational logic circuit 510, Figure 14) and the sequential logic circuit 520.
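The finite state machine structure of Figure 15 — stored current state plus combinational next-state/output logic — can be modeled in software. The states, events, and outputs below are invented for illustration and are not states defined by the patent.

```python
# Illustrative finite state machine sketch (states and events are
# assumptions): a transition table plays the role of combinational logic
# 522, while the stored `state` attribute plays the role of memory
# circuit 524 holding the current state.

# (current state, input event) -> (next state, output)
TRANSITIONS = {
    ("idle", "close"): ("clamped", "anvil_closed"),
    ("clamped", "fire"): ("firing", "advance_ibeam"),
    ("firing", "done"): ("idle", "retract_ibeam"),
}

class InstrumentFSM:
    def __init__(self):
        self.state = "idle"

    def step(self, event: str):
        """Apply one input; return the output, or None if the event is
        not valid in the current state (the FSM ignores it)."""
        nxt = TRANSITIONS.get((self.state, event))
        if nxt is None:
            return None
        self.state, output = nxt
        return output
```

Encoding the logic as a table keeps valid transitions explicit: an out-of-sequence command (e.g. "fire" before "close") simply produces no output.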
[0180] [0180] Figure 16 illustrates a surgical instrument or tool that comprises a plurality of motors that can be activated to perform various functions. In certain cases, a first motor can be activated to perform a first function, a second motor can be activated to perform a second function, a third motor can be activated to perform a third function, a fourth motor can be activated to perform a fourth function, and so on. In certain cases, the plurality of motors of the robotic surgical instrument 600 can be individually activated to cause firing, closing, and/or articulation movements in the end actuator. The firing, closing, and/or articulation movements can be transmitted to the end actuator through a drive shaft assembly, for example.
[0181] In certain instances, the surgical instrument or tool system may include a firing motor 602. The firing motor 602 can be operably coupled to a firing motor drive assembly 604, which can be configured to transmit firing motions generated by the motor 602 to the end effector, in particular to displace the I-beam element. In certain instances, the firing motions generated by the motor 602 may cause the staples to be deployed from the staple cartridge into tissue captured by the end effector and/or may cause the cutting edge of the I-beam element to be advanced to cut the captured tissue, for example. The I-beam element may be retracted by reversing the direction of the motor 602.
[0182] In certain instances, the surgical instrument or tool may include a closing motor 603. The closing motor 603 can be operably coupled to a closing motor drive assembly 605, which can be configured to transmit closing motions generated by the motor 603 to the end effector, in particular to displace a closing tube to close the anvil and compress tissue between the anvil and the staple cartridge. The closing motions may cause the end effector to transition from an open configuration to an approximated configuration to capture tissue, for example. The end effector may be transitioned to an open position by reversing the direction of the motor 603.
[0183] In certain instances, the surgical instrument or tool may include one or more articulation motors 606a, 606b, for example. The motors 606a, 606b can be operably coupled to respective articulation motor drive assemblies 608a, 608b, which can be configured to transmit articulation motions generated by the motors 606a, 606b to the end effector. In certain instances, the articulation motions may cause the end effector to articulate relative to the drive shaft assembly, for example.
[0184] As described above, the surgical instrument or tool can include a plurality of motors that can be configured to perform various independent functions. In certain instances, the plurality of motors of the surgical instrument or tool can be individually or separately activated to perform one or more functions while the other motors remain inactive. For example, the articulation motors 606a, 606b can be activated to cause the end effector to articulate while the firing motor 602 remains inactive. Alternatively, the firing motor 602 can be activated to fire the plurality of staples and/or to advance the cutting edge while the articulation motors 606a, 606b remain inactive. Furthermore, the closing motor 603 can be activated simultaneously with the firing motor 602 to cause the closing tube and the I-beam element to move distally, as described in more detail later in this document.
[0185] In certain instances, the surgical instrument or tool may include a common control module 610 that can be used with a plurality of motors of the surgical instrument or tool. In certain instances, the common control module 610 can accommodate one of the plurality of motors at a time. For example, the common control module 610 can be individually coupled to and separable from the plurality of motors of the robotic surgical instrument. In certain instances, a plurality of motors of the surgical instrument or tool may share one or more common control modules, such as the common control module 610. In certain instances, a plurality of motors of the surgical instrument or tool can be individually and selectively engaged with the common control module 610. In certain instances, the common control module 610 can be selectively switched from interfacing with one of the plurality of motors of the surgical instrument or tool to interfacing with another of the plurality of motors of the surgical instrument or tool.
[0186] In at least one example, the common control module 610 can be selectively switched between operable engagement with the articulation motors 606a, 606b and operable engagement with either the firing motor 602 or the closing motor 603. In at least one example, as shown in Figure 16, a switch 614 can be moved or transitioned between a plurality of positions and/or states. In a first position 616, the switch 614 may electrically couple the common control module 610 to the firing motor 602; in a second position 617, the switch 614 may electrically couple the common control module 610 to the closing motor 603; in a third position 618a, the switch 614 may electrically couple the common control module 610 to the first articulation motor 606a; and in a fourth position 618b, the switch 614 may electrically couple the common control module 610 to the second articulation motor 606b, for example. In certain instances, separate common control modules 610 can be electrically coupled to the firing motor 602, the closing motor 603, and the articulation motors 606a, 606b at the same time. In certain instances, the switch 614 can be a mechanical switch, an electromechanical switch, a solid-state switch, or any suitable switching mechanism.
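The multiplexing idea above, one shared control module coupled to exactly one motor at a time according to a switch position, can be sketched as follows. The class, function, and motor names are hypothetical illustrations, not an API from the disclosure; the four switch positions loosely mirror positions 616, 617, 618a, and 618b of Figure 16.

```python
# Sketch of a shared control module multiplexed across several motors by a
# multi-position switch. Only the currently coupled motor receives commands.

class CommonControlModule:
    def __init__(self):
        self.coupled_motor = None

    def couple(self, motor_name):
        self.coupled_motor = motor_name

    def drive(self, command):
        if self.coupled_motor is None:
            raise RuntimeError("no motor coupled to the control module")
        return f"{self.coupled_motor}: {command}"

# Switch positions analogous to 616 / 617 / 618a / 618b in Figure 16.
SWITCH_POSITIONS = {
    1: "firing_motor_602",
    2: "closing_motor_603",
    3: "articulation_motor_606a",
    4: "articulation_motor_606b",
}

def set_switch(module, position):
    """Electrically couple the module to the motor selected by the switch."""
    module.couple(SWITCH_POSITIONS[position])

module = CommonControlModule()
set_switch(module, 1)
module.drive("advance")       # -> "firing_motor_602: advance"
set_switch(module, 3)
module.drive("articulate")    # -> "articulation_motor_606a: articulate"
```

Routing every command through one module keeps the drive electronics shared while the switch alone decides which mechanical function is active.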
[0187] Each of the motors 602, 603, 606a, 606b may comprise a torque sensor to measure the output torque on the shaft of the motor. The force on an end effector may be sensed in any conventional manner, such as by force sensors on the outer sides of the jaws or by a torque sensor on the motor that drives the jaws.
[0188] In various instances, as shown in Figure 16, the common control module 610 may comprise a motor driver 626 that may comprise one or more H-bridge FETs. The motor driver 626 can modulate the power transmitted from a power source 628 to a motor coupled to the common control module 610 based on input from a microcontroller 620 (the "controller"), for example. In certain instances, the microcontroller 620 can be employed to determine the current drawn by the motor, for example, while the motor is coupled to the common control module 610, as described above.
[0189] In certain examples, the microcontroller 620 may include a microprocessor 622 (the "processor") and one or more non-transitory computer-readable media or memory units 624 (the "memory"). In certain instances, the memory 624 can store various program instructions which, when executed, may cause the processor 622 to perform a plurality of functions and/or calculations described herein. In certain instances, one or more of the memory units 624 can be coupled to the processor 622, for example.
[0190] In certain instances, the power source 628 can be employed to supply power to the microcontroller 620, for example. In certain instances, the power source 628 may comprise a battery (or "battery pack" or "power pack"), such as a Li-ion battery, for example. In certain instances, the battery pack can be configured to be releasably mounted to a handle to supply power to the surgical instrument 600. A number of battery cells connected in series can be used as the power source 628. In certain instances, the power source 628 can be replaceable and/or rechargeable, for example.
[0191] In various instances, the processor 622 can control the motor driver 626 to control the position, direction of rotation, and/or velocity of a motor that is coupled to the common control module 610. In certain instances, the processor 622 can signal the motor driver 626 to stop and/or disable a motor that is coupled to the common control module 610. It should be understood that the term "processor", as used herein, includes any microprocessor, microcontroller, or other basic computing device that incorporates the functions of a computer's central processing unit (CPU) on an integrated circuit or, at most, a few integrated circuits. The processor is a multipurpose programmable device that accepts digital data as input, processes it according to instructions stored in its memory, and provides results as output. It is an example of sequential digital logic, as it has internal memory. Processors operate on numbers and symbols represented in the binary numeral system.
[0192] In one example, the processor 622 may be any single-core or multi-core processor, such as those known under the trade name ARM Cortex by Texas Instruments. In certain instances, the microcontroller 620 may be an LM4F230H5QR, available from Texas Instruments, for example. In at least one example, the Texas Instruments LM4F230H5QR is an ARM Cortex-M4F processor core comprising 256 KB of single-cycle flash memory, or other non-volatile memory, up to 40 MHz, a prefetch buffer to optimize performance above 40 MHz, 32 KB of single-cycle SRAM, internal ROM loaded with StellarisWare® software, 2 KB of EEPROM, one or more PWM modules, one or more QEI analogs, and one or more 12-bit ADCs with 12 analog input channels, among other features that are readily available from the product datasheet. Other microcontrollers can be readily substituted for use with the module 4410. Accordingly, the present disclosure should not be limited in this context.
[0193] In certain instances, the memory 624 may include program instructions for controlling each of the motors of the surgical instrument 600 that are couplable to the common control module 610. For example, the memory 624 may include program instructions for controlling the firing motor 602, the closing motor 603, and the articulation motors 606a, 606b. Such program instructions may cause the processor 622 to control the firing, closing, and articulation functions in accordance with inputs from the control algorithms or programs of the surgical instrument or tool.
[0194] In certain instances, one or more mechanisms and/or sensors, such as sensors 630, can be employed to alert the processor 622 to the program instructions that should be used in a particular setting. For example, the sensors 630 can alert the processor 622 to use the program instructions associated with firing, closing, and articulating the end effector. In certain instances, the sensors 630 may comprise position sensors that can be employed to sense the position of the switch 614, for example. Accordingly, the processor 622 may use the program instructions associated with firing the I-beam of the end effector upon detecting, through the sensors 630, for example, that the switch 614 is in the first position 616; the processor 622 may use the program instructions associated with closing the anvil upon detecting, through the sensors 630, for example, that the switch 614 is in the second position 617; and the processor 622 may use the program instructions associated with articulating the end effector upon detecting, through the sensors 630, for example, that the switch 614 is in the third or fourth position 618a, 618b.
[0195] Figure 17 is a schematic diagram of a robotic surgical instrument 700 configured to operate a surgical tool described herein, according to one aspect of the present disclosure. The robotic surgical instrument 700 can be programmed or configured to control distal/proximal translation of a displacement member, distal/proximal displacement of a closing tube, rotation of the drive shaft, and articulation, with either a single type or multiple articulation drive links. In one aspect, the surgical instrument 700 can be programmed or configured to individually control a firing member, a closing member, a drive shaft member, and/or one or more articulation members. The surgical instrument 700 comprises a control circuit 710 configured to control motor-driven firing members, closing members, drive shaft members, and/or one or more articulation members.
[0196] In one aspect, the robotic surgical instrument 700 comprises a control circuit 710 configured to control an anvil 716 and an I-beam 714 (including a sharp cutting edge) portion of an end effector 702, a removable staple cartridge 718, a drive shaft 740, and one or more articulation members 742a, 742b, via a plurality of motors 704a to 704e. A position sensor 734 can be configured to provide position feedback of the I-beam 714 to the control circuit 710. Other sensors 738 can be configured to provide feedback to the control circuit 710. A timer/counter 731 provides timing and counting information to the control circuit 710. A power source 712 can be provided to operate the motors 704a to 704e, and a current sensor 736 provides motor current feedback to the control circuit 710. The motors 704a to 704e can be operated individually by the control circuit 710 in open-loop or closed-loop feedback control.
[0197] In one aspect, the control circuit 710 may comprise one or more microcontrollers, microprocessors, or other processors suitable for executing instructions that cause the processor or processors to perform one or more tasks. In one aspect, a timer/counter 731 provides an output signal, such as elapsed time or a digital count, to the control circuit 710 to correlate the position of the I-beam 714, as determined by the position sensor 734, with the output of the timer/counter 731, so that the control circuit 710 can determine the position of the I-beam 714 at a specific time (t) relative to a starting position.
[0198] In one aspect, the control circuit 710 can be programmed to control functions of the end effector 702 based on one or more tissue conditions. The control circuit 710 can be programmed to sense tissue conditions, such as thickness, either directly or indirectly, as described herein. The control circuit 710 can be programmed to select a firing control program or a closure control program based on tissue conditions. A firing control program may describe the distal motion of the displacement member. Different firing control programs can be selected to better treat different tissue conditions. For example, when thicker tissue is present, the control circuit 710 can be programmed to translate the displacement member at a lower velocity and/or with lower power. When thinner tissue is present, the control circuit 710 can be programmed to translate the displacement member at a higher velocity and/or with higher power. A closure control program can control the closing force applied to the tissue by the anvil 716. Other control programs control the rotation of the drive shaft 740 and the articulation members 742a, 742b.
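The thickness-based program selection described above can be sketched as a simple dispatch on a measured tissue parameter. The thresholds, speeds, and program labels below are hypothetical illustrations and are not values from the disclosure.

```python
# Hypothetical selection of a firing control program from measured tissue
# thickness: thicker tissue -> slower displacement-member velocity,
# thinner tissue -> faster velocity. All numbers are illustrative only.

def select_firing_program(tissue_thickness_mm):
    """Return (displacement-member velocity in mm/s, program label)."""
    if tissue_thickness_mm >= 3.0:        # thick tissue: translate slowly
        return (5.0, "thick-tissue program")
    if tissue_thickness_mm >= 1.5:        # intermediate tissue
        return (10.0, "standard program")
    return (15.0, "thin-tissue program")  # thin tissue: translate faster

speed, label = select_firing_program(3.4)   # -> (5.0, "thick-tissue program")
```

In a real instrument the thickness estimate would itself come from sensor feedback (gap distance, closure force, or tissue impedance) rather than being passed in directly.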
[0199] In one aspect, the control circuit 710 can generate motor setpoint signals. The motor setpoint signals can be provided to various motor controllers 708a to 708e. The motor controllers 708a to 708e may comprise one or more circuits configured to provide motor drive signals to the motors 704a to 704e in order to drive the motors 704a to 704e.
[0200] In one aspect, the control circuit 710 can initially operate each of the motors 704a to 704e in an open-loop configuration for a first open-loop portion of a stroke of the displacement member. Based on the response of the robotic surgical instrument 700 during the open-loop portion of the stroke, the control circuit 710 can select a firing control program in a closed-loop configuration. The response of the instrument may include a translation distance of the displacement member during the open-loop portion, a time elapsed during the open-loop portion, the energy provided to one of the motors 704a to 704e during the open-loop portion, a sum of pulse widths of a motor drive signal, and so forth. After the open-loop portion, the control circuit 710 can implement the selected firing control program for a second portion of the displacement member stroke. For example, during the closed-loop portion of the stroke, the control circuit 710 can modulate one of the motors 704a to 704e based on translation data describing a position of the displacement member in a closed loop to translate the displacement member at a constant velocity.
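The two-phase strategy above, an open-loop characterization stroke followed by selection of a closed-loop program, can be sketched as follows. The toy plant model, the distance threshold, and the target velocities are all assumed values for illustration, not parameters of the instrument.

```python
# Sketch: drive the motor open loop for part of the stroke, measure how far
# the displacement member travelled for a fixed command, then pick a
# closed-loop firing program from that response.

def open_loop_portion(plant, duty_cycle, steps):
    """Drive the motor open loop; return the distance travelled (mm)."""
    start = plant.position
    for _ in range(steps):
        plant.step(duty_cycle)
    return plant.position - start

def select_closed_loop_program(open_loop_distance_mm):
    """Less travel for the same command implies a stiffer load (e.g. thicker
    tissue), so choose a slower closed-loop target velocity."""
    if open_loop_distance_mm < 2.0:
        return {"target_velocity_mm_s": 5.0}
    return {"target_velocity_mm_s": 10.0}

class SimplePlant:
    """Toy displacement-member model: travel proportional to the command."""
    def __init__(self, compliance):
        self.position = 0.0
        self.compliance = compliance  # mm of travel per unit command per step

    def step(self, duty_cycle):
        self.position += self.compliance * duty_cycle

stiff = SimplePlant(compliance=0.05)                            # e.g. thick tissue
travelled = open_loop_portion(stiff, duty_cycle=0.5, steps=40)  # 1.0 mm
program = select_closed_loop_program(travelled)                 # slower program
```

The same pattern applies if the recorded response is elapsed time, delivered energy, or a sum of drive-signal pulse widths instead of distance.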
[0201] In one aspect, the motors 704a to 704e can receive power from a power source 712. The power source 712 can be a DC power source supplied by a main alternating current power source, a battery, a supercapacitor, or any other suitable energy source. The motors 704a to 704e can be mechanically coupled to individual movable mechanical elements, such as the I-beam 714, the anvil 716, the drive shaft 740, the articulation member 742a, and the articulation member 742b, via respective transmissions 706a to 706e. The transmissions 706a to 706e may include one or more gears or other linkage components for coupling the motors 704a to 704e to the movable mechanical elements. A position sensor 734 can sense a position of the I-beam 714. The position sensor 734 can be or can include any type of sensor capable of generating position data indicating a position of the I-beam 714. In some examples, the position sensor 734 may include an encoder configured to provide a series of pulses to the control circuit 710 as the I-beam 714 translates distally and proximally. The control circuit 710 can track the pulses to determine the position of the I-beam 714. Other suitable position sensors can be used, including, for example, a proximity sensor. Other types of position sensors can provide other signals indicating the motion of the I-beam 714.
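The pulse-tracking scheme mentioned above (an encoder emitting pulses as the I-beam translates, with the control circuit counting them) can be sketched as follows. The resolution constant is an assumed value; a real encoder interface would also debounce and decode quadrature signals.

```python
# Sketch of position tracking from an incremental encoder: the control
# circuit counts pulses, with the sign given by the direction of travel,
# and converts the count to displacement.

MM_PER_PULSE = 0.01   # assumed encoder resolution (illustrative)

class PulseTracker:
    def __init__(self):
        self.count = 0

    def on_pulse(self, direction):
        """direction: +1 for distal translation, -1 for proximal."""
        self.count += direction

    @property
    def position_mm(self):
        return self.count * MM_PER_PULSE

tracker = PulseTracker()
for _ in range(250):       # 250 distal pulses
    tracker.on_pulse(+1)
for _ in range(50):        # 50 proximal pulses
    tracker.on_pulse(-1)
tracker.position_mm        # -> 2.0 mm net distal displacement
```

Because the count is relative, such a tracker must be zeroed at a known home position; that limitation is what the absolute positioning system discussed later avoids.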
[0202] In one aspect, the control circuit 710 is configured to drive a firing member such as the I-beam 714 portion of the end effector 702. The control circuit 710 provides a motor setpoint to a motor control 708a, which provides a drive signal to the motor 704a. The output shaft of the motor 704a is coupled to a torque sensor 744a. The torque sensor 744a is coupled to a transmission 706a, which is coupled to the I-beam 714. The transmission 706a comprises movable mechanical elements, such as rotating elements and a firing member, to control the movement of the I-beam 714 distally and proximally along a longitudinal axis of the end effector 702. In one aspect, the motor 704a can be coupled to the knife gear assembly, which includes a knife gear reduction set that includes a first knife drive gear and a second knife drive gear. The torque sensor 744a provides a firing force feedback signal to the control circuit 710. The firing force signal represents the force required to fire or displace the I-beam 714. A position sensor 734 can be configured to provide the position of the I-beam 714 along the firing stroke, or the position of the firing member, as a feedback signal to the control circuit 710. The end effector 702 may include additional sensors 738 configured to provide feedback signals to the control circuit 710. When ready to use, the control circuit 710 can provide a firing signal to the motor control 708a. In response to the firing signal, the motor 704a can drive the firing member distally along the longitudinal axis of the end effector 702 from a proximal stroke start position to a distal stroke end position relative to the stroke start position. As the firing member translates distally, the I-beam 714, with a cutting element positioned at its distal end, advances distally to cut tissue located between the staple cartridge 718 and the anvil 716.
[0203] In one aspect, the control circuit 710 is configured to drive a closing member such as the anvil 716 portion of the end effector 702. The control circuit 710 provides a motor setpoint to a motor control 708b, which provides a drive signal to the motor 704b. The output shaft of the motor 704b is coupled to a torque sensor 744b. The torque sensor 744b is coupled to a transmission 706b, which is coupled to the anvil 716. The transmission 706b comprises movable mechanical elements, such as rotating elements and a closing member, to control the movement of the anvil 716 between the open and closed positions. In one aspect, the motor 704b is coupled to a closing gear assembly, which includes a closing reduction gear set supported in meshing engagement with the closing spur gear. The torque sensor 744b provides a closing force feedback signal to the control circuit 710. The closing force feedback signal represents the closing force applied to the anvil 716. The position sensor 734 can be configured to provide the position of the closing member as a feedback signal to the control circuit 710. Additional sensors 738 on the end effector 702 can provide the closing force feedback signal to the control circuit 710. The pivotable anvil 716 is positioned opposite the staple cartridge 718. When ready to use, the control circuit 710 can provide a closing signal to the motor control 708b. In response to the closing signal, the motor 704b advances a closing member to grasp tissue between the anvil 716 and the staple cartridge 718.
[0204] In one aspect, the control circuit 710 is configured to rotate a shaft member such as the drive shaft 740 to rotate the end effector 702. The control circuit 710 provides a motor setpoint to a motor control 708c, which provides a drive signal to the motor 704c. The output shaft of the motor 704c is coupled to a torque sensor 744c. The torque sensor 744c is coupled to a transmission 706c, which is coupled to the drive shaft 740. The transmission 706c comprises movable mechanical elements, such as rotating elements, to control rotation of the drive shaft 740 clockwise or counterclockwise up to and over 360°. In one aspect, the motor 704c is coupled to the rotary drive assembly, which includes a tube gear segment that is formed on (or attached to) the proximal end of the proximal closing tube for operable engagement by a rotary gear assembly that is operably supported on the tool mounting plate. The torque sensor 744c provides a rotation force feedback signal to the control circuit 710. The rotation force feedback signal represents the rotation force applied to the drive shaft 740. The position sensor 734 can be configured to provide the position of the closing member as a feedback signal to the control circuit 710. Additional sensors 738, such as a drive shaft encoder, can provide the rotational position of the drive shaft 740 to the control circuit 710.
[0205] In one aspect, the control circuit 710 is configured to articulate the end effector 702. The control circuit 710 provides a motor setpoint to a motor control 708d, which provides a drive signal to the motor 704d. The output shaft of the motor 704d is coupled to a torque sensor 744d. The torque sensor 744d is coupled to a transmission 706d, which is coupled to an articulation member 742a. The transmission 706d comprises movable mechanical elements, such as articulation elements, to control articulation of the end effector 702 ±65°. In one aspect, the motor 704d is coupled to an articulation nut, which is rotatably journaled on the proximal end portion of the distal spine portion and is rotatably driven thereon by an articulation gear assembly. The torque sensor 744d provides an articulation force feedback signal to the control circuit 710. The articulation force feedback signal represents the articulation force applied to the end effector 702. Sensors 738, such as an articulation encoder, can provide the articulation position of the end effector 702 to the control circuit 710.
[0206] In another aspect, the articulation function of the robotic surgical system 700 may comprise two articulation members, or links, 742a, 742b. These articulation members 742a, 742b are driven by separate disks on the robot interface (the rack), which are driven by the two motors 708d, 708e. When the separate firing motor 704a is provided, each articulation link 742a, 742b can be antagonistically driven with respect to the other link to provide a resistive holding motion and a load to the head when it is not moving and to provide an articulation motion when the head is articulated. The articulation members 742a, 742b attach to the head at a fixed radius as the head is rotated. Accordingly, the mechanical advantage of the push-and-pull links changes as the head is rotated. This change in mechanical advantage may be more pronounced with other articulation-link drive systems.
[0207] In one aspect, the one or more motors 704a to 704e may comprise a brushed DC motor with a gearbox and mechanical links to a firing member, closing member, or articulation member. Another example includes electric motors 704a to 704e that operate the movable mechanical elements, such as the displacement member, the articulation links, the closing tube, and the drive shaft. An outside influence is an unmeasured, unpredictable influence of things such as tissue, surrounding bodies, and friction on the physical system. Such an outside influence can be referred to as drag, which acts in opposition to one of the electric motors 704a to 704e. An outside influence, such as drag, can cause the operation of the physical system to deviate from a desired operation of the physical system.
[0208] In one aspect, the position sensor 734 can be implemented as an absolute positioning system. In one aspect, the position sensor 734 can comprise a magnetic rotary absolute positioning system implemented as an AS5055EQFT single-chip magnetic rotary position sensor, available from Austria Microsystems, AG. The position sensor 734 can interface with the control circuit 710 to provide an absolute positioning system. The position sensor 734 can comprise multiple Hall-effect elements located above a magnet and coupled to a CORDIC processor, also known as the digit-by-digit method or Volder's algorithm, which is provided to implement a simple and efficient algorithm for calculating hyperbolic and trigonometric functions that requires only addition, subtraction, bit-shift, and table-lookup operations.
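The CORDIC method named above can be illustrated with a minimal vectoring-mode sketch: it recovers the angle of a vector (x, y), such as the field components seen by the Hall elements, using only additions, subtractions, shifts by powers of two, and a small arctangent lookup table. The iteration count and float arithmetic are choices of this sketch; a sensor implements the same loop in fixed point.

```python
import math

# Minimal CORDIC (vectoring mode), Volder's digit-by-digit algorithm:
# rotate the vector toward the x-axis in steps of atan(2^-i), accumulating
# the applied rotation; the accumulated angle converges to atan2(y, x)
# for x > 0. Only add/sub, shifts (here written as 2.0**-i), and a lookup
# table are needed.

ATAN_TABLE = [math.atan(2.0 ** -i) for i in range(24)]   # lookup table

def cordic_angle(x, y, iterations=24):
    angle = 0.0
    for i in range(iterations):
        if y > 0:            # rotate clockwise to drive y toward zero
            x, y = x + y * 2.0 ** -i, y - x * 2.0 ** -i
            angle += ATAN_TABLE[i]
        else:                # rotate counterclockwise
            x, y = x - y * 2.0 ** -i, y + x * 2.0 ** -i
            angle -= ATAN_TABLE[i]
    return angle             # radians, valid for x > 0

cordic_angle(1.0, 1.0)       # ≈ math.pi / 4
```

The final x also encodes the vector magnitude, scaled by a fixed, precomputable CORDIC gain; that part is omitted here for brevity.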
[0209] In one aspect, the control circuit 710 can be in communication with one or more sensors 738. The sensors 738 can be positioned on the end effector 702 and adapted to operate with the robotic surgical instrument 700 to measure various derived parameters, such as gap distance versus time, tissue compression versus time, and anvil strain versus time. The sensors 738 may comprise a magnetic sensor, a magnetic field sensor, a strain gauge, a load cell, a pressure sensor, a force sensor, a torque sensor, an inductive sensor such as an eddy-current sensor, a resistive sensor, a capacitive sensor, an optical sensor, and/or any other suitable sensor for measuring one or more parameters of the end effector 702. The sensors 738 may include one or more sensors. The sensors 738 may be located on the deck of the staple cartridge 718 to determine the location of tissue using segmented electrodes. The torque sensors 744a to 744e can be configured to sense force, such as firing force, closing force, and/or articulation force, among others. Accordingly, the control circuit 710 can sense (1) the closure load experienced by the distal closing tube and its position, (2) the firing member on the rack and its position, (3) which portion of the staple cartridge 718 has tissue on it, and (4) the load and position of both articulation links.
[0210] In one aspect, the one or more sensors 738 may comprise a strain gauge, such as a micro-strain gauge, configured to measure the magnitude of the strain in the anvil 716 during a clamped condition. The strain gauge provides an electrical signal whose amplitude varies with the magnitude of the strain. The sensors 738 may comprise a pressure sensor configured to detect a pressure generated by the presence of compressed tissue between the anvil 716 and the staple cartridge 718. The sensors 738 can be configured to detect the impedance of a tissue section located between the anvil 716 and the staple cartridge 718, which is indicative of the thickness and/or fullness of the tissue located therebetween.
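The relationship between the strain gauge's electrical amplitude and anvil strain can be sketched with a standard quarter-bridge approximation. The gauge factor, excitation voltage, and stiffness calibration below are hypothetical placeholders, not values from the disclosure; they only illustrate how a bridge voltage might be mapped to strain and then to an estimated clamping force.

```python
# Hypothetical conversion from a strain-gauge bridge reading to anvil strain
# and an estimated clamping force. All constants are illustrative.

GAUGE_FACTOR = 2.0          # typical metallic foil gauge
EXCITATION_V = 5.0          # assumed bridge excitation voltage
N_PER_MICROSTRAIN = 0.8     # assumed anvil stiffness calibration

def microstrain_from_bridge(v_out):
    """Quarter-bridge approximation: v_out ≈ V_exc * GF * strain / 4."""
    strain = 4.0 * v_out / (EXCITATION_V * GAUGE_FACTOR)
    return strain * 1e6     # dimensionless strain -> microstrain

def clamp_force_newtons(v_out):
    """Map measured strain to force via a linear stiffness calibration."""
    return microstrain_from_bridge(v_out) * N_PER_MICROSTRAIN

microstrain_from_bridge(0.0025)   # -> 1000.0 microstrain
```

The point of the sketch is the signal chain (bridge voltage, strain, force), which is how an amplitude-varying gauge signal becomes a quantity the control circuit can act on.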
[0211] In one aspect, the sensors 738 can be implemented as one or more limit switches, electromechanical devices, solid-state switches, Hall-effect devices, magneto-resistive (MR) devices, giant magneto-resistive (GMR) devices, magnetometers, among others. In other implementations, the sensors 738 can be implemented as solid-state switches that operate under the influence of light, such as optical sensors, infrared sensors, ultraviolet sensors, among others. Additionally, the switches can be solid-state devices such as transistors (for example, FET, junction FET, MOSFET, bipolar, and the like). In other implementations, the sensors 738 can include conductorless electrical switches, ultrasonic switches, accelerometers, and inertial sensors, among others.
[0212] In one aspect, the sensors 738 can be configured to measure the forces exerted on the anvil 716 by the closure drive system. For example, one or more sensors 738 may be located at a point of interaction between the closing tube and the anvil 716 to detect the closing forces applied by the closing tube to the anvil 716. The forces exerted on the anvil 716 may be representative of the tissue compression experienced by the tissue section captured between the anvil 716 and the staple cartridge 718. The one or more sensors 738 can be positioned at various interaction points along the closure drive system to detect the closing forces applied to the anvil 716 by the closure drive system. The one or more sensors 738 can be sampled in real time during a clamping operation by the processor of the control circuit 710. The control circuit 710 receives the real-time sample measurements to provide and analyze time-based information and assess, in real time, the closing forces applied to the anvil 716.
[0213] In one aspect, a current sensor 736 can be employed to measure the current drawn by each of the motors 704a to 704e. The force required to advance any of the movable mechanical elements, such as the I-beam 714, corresponds to the current drawn by one of the motors 704a to 704e. The force is converted to a digital signal and provided to the control circuit 710. The control circuit 710 can be configured to simulate the response of the actual system of the instrument in the software of the controller. A displacement member can be actuated to move an I-beam 714 in the end effector 702 at or near a target velocity. The robotic surgical instrument 700 can include a feedback controller, which can be any feedback controller including, but not limited to, a PID controller, a state feedback controller, a linear quadratic regulator (LQR), and/or an adaptive controller, for example. The robotic surgical instrument 700 can include a power source to convert the signal from the feedback controller into a physical input, such as a voltage, a PWM voltage, a frequency-modulated voltage, a current, a torque, and/or a force, for example. Additional details are disclosed in US Patent Application Serial No. 15/636,829, entitled CLOSED LOOP VELOCITY CONTROL TECHNIQUES FOR ROBOTIC SURGICAL INSTRUMENT, filed June 29, 2017, which is hereby incorporated by reference herein in its entirety.
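Of the feedback controllers listed above, the PID form can be illustrated with a short velocity-loop sketch: the controller turns velocity error into a command (e.g. a PWM duty value) for the motor driving the displacement member. The gains and the toy first-order motor model are assumptions of this sketch, not parameters of the instrument.

```python
# Sketch of a PID velocity loop for the displacement member. Gains, time
# step, and the plant model are illustrative placeholders only.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def simulate(controller, target, steps=500, dt=0.001):
    """Toy first-order motor: velocity relaxes toward gain * command with a
    10 ms time constant. Returns the velocity after the run."""
    velocity = 0.0
    for _ in range(steps):
        command = controller.update(target, velocity)
        velocity += (2.0 * command - velocity) * dt / 0.01
    return velocity

pid = PID(kp=0.8, ki=20.0, kd=0.0, dt=0.001)
simulate(pid, target=10.0)   # settles near 10.0 mm/s
```

The integral term is what removes the steady-state error that a proportional-only loop would leave against a constant load such as tissue drag.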
[0214] Figure 18 illustrates a block diagram of a surgical instrument 750 programmed to control the distal translation of a displacement member according to one aspect of the present disclosure. In one aspect, the surgical instrument 750 is programmed to control the distal translation of a displacement member such as the I-beam 764. The surgical instrument 750 comprises an end effector 752 that may comprise an anvil 766, an I-beam 764 (including a sharp cutting edge), and a removable staple cartridge 768.
[0215] The position, movement, displacement, and/or translation of a linear displacement member, such as the I-beam 764, can be measured by an absolute positioning system, a sensor arrangement, and a position sensor 784. Because the I-beam 764 is coupled to a longitudinally movable drive member, the position of the I-beam 764 can be determined by measuring the position of the longitudinally movable drive member employing the position sensor 784. Accordingly, in the following description, the position, displacement, and/or translation of the I-beam 764 can be achieved by the position sensor 784 as described herein. A control circuit 760 can be programmed to control the translation of the displacement member, such as the I-beam 764. The control circuit 760, in some examples, may comprise one or more microcontrollers, microprocessors, or other processors suitable for executing instructions that cause the processor or processors to control the displacement member, for example the I-beam 764, as described. In one aspect, a timer/counter 781 provides an output signal, such as elapsed time or a digital count, to the control circuit 760 to correlate the position of the I-beam 764, as determined by the position sensor 784, with the output of the timer/counter 781, so that the control circuit 760 can determine the position of the I-beam 764 at a specific time (t) relative to a starting position. The timer/counter 781 can be configured to measure elapsed time, count external events, or time external events.
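Correlating position-sensor samples with the timer/counter output, as described above, is what lets the control circuit derive velocity as well as position. The sketch below is illustrative; the tick rate and sample values are assumptions, not figures from the disclosure.

```python
# Sketch: pair each position sample with the timer/counter value at which it
# was taken, then estimate displacement-member velocity over the window.

TICK_S = 0.001   # assumed 1 kHz timer/counter tick period

def velocity_from_samples(samples):
    """samples: list of (timer_count, position_mm) pairs, oldest first.
    Returns average velocity in mm/s over the sampled window."""
    (c0, p0), (c1, p1) = samples[0], samples[-1]
    elapsed_s = (c1 - c0) * TICK_S
    return (p1 - p0) / elapsed_s

velocity_from_samples([(100, 0.0), (150, 0.5), (200, 1.0)])   # -> 10.0 mm/s
```

A negative result simply indicates proximal (retracting) motion, so the same correlation serves both directions of travel.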
[0216] The control circuit 760 can generate a motor setpoint signal 772. The motor setpoint signal 772 can be provided to a motor controller 758. The motor controller 758 can comprise one or more circuits configured to provide a motor drive signal 774 to the motor 754 to drive the motor 754, as described in the present disclosure. In some examples, the motor 754 can be a brushed DC electric motor. For example, the speed of the motor 754 can be proportional to the motor drive signal 774. In some examples, the motor 754 can be a brushless DC electric motor, and the motor drive signal 774 can comprise a PWM signal provided to one or more stator windings of the motor 754. In addition, in some examples, the motor controller 758 may be omitted, and the control circuit 760 can generate the motor drive signal 774 directly.
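As a rough illustration of the proportional relationship described above between motor speed and drive signal, the sketch below maps a motor setpoint to a clamped PWM duty cycle. The function name, the 6000 RPM ceiling, and the linear mapping are all invented for illustration:

```python
# Sketch: convert a motor setpoint into a PWM duty cycle in [0.0, 1.0],
# assuming speed is roughly proportional to the drive signal.
# All names and limits are assumptions, not the instrument's values.

def setpoint_to_duty(setpoint_rpm, max_rpm=6000.0):
    duty = setpoint_rpm / max_rpm       # linear speed-to-duty mapping
    return min(max(duty, 0.0), 1.0)     # clamp to the valid PWM range

half_speed = setpoint_to_duty(3000.0)   # mid-range request
```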
[0217] The motor 754 can receive power from a power source
[0218] The control circuit 760 can be in communication with one or more sensors 788. The sensors 788 can be positioned on the end effector 752 and adapted to operate with the surgical instrument 750 to measure various derived parameters, such as gap distance versus time, tissue compression versus time, and anvil strain versus time. The sensors 788 can comprise a magnetic sensor, a magnetic field sensor, a strain gauge, a pressure sensor, a force sensor, an inductive sensor such as an eddy-current sensor, a resistive sensor, a capacitive sensor, an optical sensor, and/or any other suitable sensors for measuring one or more parameters of the end effector 752. The sensors 788 may include one or more sensors.
[0219] The one or more sensors 788 may comprise a strain gauge, such as a micro-strain gauge, configured to measure the magnitude of the strain in the anvil 766 during a clamped condition. The strain gauge provides an electrical signal whose amplitude varies with the magnitude of the strain. The sensors 788 can comprise a pressure sensor configured to detect a pressure generated by the presence of compressed tissue between the anvil 766 and the staple cartridge 768. The sensors 788 can be configured to detect the impedance of a tissue section located between the anvil 766 and the staple cartridge 768, which is indicative of the thickness and/or fullness of the tissue located therebetween.
[0220] The sensors 788 can be configured to measure the forces exerted on the anvil 766 by a closure drive system. For example, one or more sensors 788 can be at an interaction point between a closure tube and the anvil 766 to detect the closure forces applied by the closure tube to the anvil 766. The forces exerted on the anvil 766 can be representative of the tissue compression experienced by the tissue section captured between the anvil 766 and the staple cartridge 768. The one or more sensors 788 can be positioned at various interaction points along the closure drive system to detect the closure forces applied to the anvil 766 by the closure drive system. The one or more sensors 788 can be sampled in real time during a clamping operation by a processor portion of the control circuit 760. The control circuit 760 receives the real-time sample measurements to provide and analyze time-based information and assess, in real time, the closure forces applied to the anvil 766.
[0221] A current sensor 786 can be used to measure the current drawn by the motor 754. The force required to advance the I-beam 764 corresponds to the current drawn by the motor 754.
[0222] The control circuit 760 can be configured to simulate the actual system response of the instrument in the controller software. A displacement member can be actuated to move an I-beam 764 in the end effector 752 at or near a target velocity. The surgical instrument 750 can include a feedback controller, which can be one or any feedback controller including, but not limited to, a PID controller, a state feedback controller, a linear-quadratic regulator (LQR), and/or an adaptive controller, for example. The surgical instrument 750 can include a power source to convert the signal from the feedback controller into a physical input such as case voltage, PWM voltage, frequency-modulated voltage, current, torque, and/or force, for example.
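The PID option mentioned above can be sketched as follows. This is a generic textbook PID loop driving a toy first-order plant with a drag term, not the instrument's actual controller; every gain, timestep, and plant constant is illustrative:

```python
# Minimal PID feedback-controller sketch: regulate a displacement
# member's velocity toward a target despite a drag-like load.
# Gains and the plant model are invented for illustration.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error + self.ki * self.integral
                + self.kd * derivative)

# Drive a toy plant (velocity with a drag term) toward 10 mm/s
pid = PID(kp=0.8, ki=0.5, kd=0.0, dt=0.01)
speed = 0.0
for _ in range(1000):                    # 10 s of simulated time
    u = pid.update(10.0, speed)
    speed += (u - 0.2 * speed) * 0.01    # drag opposes the motor
```

The integral term is what lets the loop hold the target velocity against a steady drag, which is exactly the kind of external influence the surrounding paragraphs describe.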
[0223] The actual drive system of the surgical instrument 750 is configured to drive the displacement member, cutting member, or I-beam 764 by a brushed DC motor with a gearbox and mechanical links to an articulation and/or knife system. Another example is the electric motor 754 that operates the displacement member and the articulation driver, for example, of an interchangeable shaft assembly. An outside influence is an unmeasured, unpredictable influence of things such as tissue, surrounding bodies, and friction on the physical system. Such an outside influence can be referred to as drag, which acts in opposition to the electric motor 754. The outside influence, such as drag, can cause the operation of the physical system to deviate from a desired operation of the physical system.
[0224] Various exemplary aspects are directed to a surgical instrument 750 comprising an end effector 752 with motor-driven surgical stapling and cutting implements. For example, a motor 754 can drive a displacement member distally and proximally along a longitudinal axis of the end effector 752. The end effector 752 can comprise a pivotable anvil 766 and, when configured for use, a staple cartridge 768 positioned opposite the anvil 766. A clinician can grasp tissue between the anvil 766 and the staple cartridge 768, as described in the present disclosure. When ready to use the instrument 750, the clinician can provide a firing signal, for example, by pulling a trigger of the instrument 750. In response to the firing signal, the motor 754 can drive the displacement member distally along the longitudinal axis of the end effector 752 from a proximal starting position to an end-of-stroke position distal to the starting position. As the displacement member translates distally, an I-beam 764 with a cutting element positioned at a distal end can cut the tissue between the staple cartridge 768 and the anvil 766.
[0225] In various examples, the surgical instrument 750 can comprise a control circuit 760 programmed to control the distal translation of the displacement member, such as the I-beam 764, for example, based on one or more tissue conditions. The control circuit 760 can be programmed to sense tissue conditions, such as thickness, either directly or indirectly, as described herein. The control circuit 760 can be programmed to select a firing control program based on tissue conditions. A firing control program can describe the distal motion of the displacement member. Different firing control programs can be selected to better treat different tissue conditions. For example, when thicker tissue is present, the control circuit 760 can be programmed to translate the displacement member at a lower velocity and/or with lower power. When thinner tissue is present, the control circuit 760 can be programmed to translate the displacement member at a higher velocity and/or with higher power.
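The thickness-based selection logic described above could look like the following sketch; the threshold, speeds, and power values are invented purely for illustration:

```python
# Sketch: select a firing control program from a sensed tissue
# thickness. Thicker tissue gets a slower, lower-power program.
# The 3.0 mm threshold and program parameters are assumptions.

def select_firing_program(tissue_thickness_mm):
    if tissue_thickness_mm >= 3.0:                      # thick tissue
        return {"speed_mm_s": 5.0, "power_pct": 60}
    return {"speed_mm_s": 12.0, "power_pct": 90}        # thin tissue

thick_program = select_firing_program(4.0)
thin_program = select_firing_program(1.5)
```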
[0226] In some examples, the control circuit 760 can initially operate the motor 754 in an open-loop configuration for a first open-loop portion of a stroke of the displacement member. Based on a response of the instrument 750 during the open-loop portion of the stroke, the control circuit 760 can select a firing control program. The response of the instrument can include a translation distance of the displacement member during the open-loop portion, a time elapsed during the open-loop portion, the energy provided to the motor 754 during the open-loop portion, a sum of pulse widths of a motor drive signal, etc. After the open-loop portion, the control circuit 760 can implement the selected firing control program for a second portion of the displacement member stroke. For example, during the closed-loop portion of the stroke, the control circuit 760 can modulate the motor 754 based on translation data describing a position of the displacement member in a closed-loop manner to translate the displacement member at a constant velocity. Additional details are disclosed in US Patent Application Serial No. 15/720,852, entitled SYSTEM AND METHODS FOR CONTROLLING A DISPLAY OF A SURGICAL INSTRUMENT, filed on September 29, 2017, which is hereby incorporated by reference in its entirety.
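The two-phase stroke described above, an open-loop characterization portion followed by selection of a firing program from the observed response, can be sketched with a toy tissue model. Every constant here (duty, duration, the load model, the travel threshold, the target speeds) is an assumption for illustration only:

```python
# Sketch: run a fixed open-loop burst, measure how far the displacement
# member traveled, and pick a closed-loop target velocity accordingly.
# The "plant" (v = duty / load) is a deliberately crude tissue model.

def run_stroke(load):
    pos, duty, dt = 0.0, 0.5, 0.01
    for _ in range(50):                 # 0.5 s open-loop portion
        pos += (duty / load) * dt       # toy plant: stiffer load -> slower
    open_loop_travel = pos
    # Short travel implies thick/tough tissue: choose a slower program
    target_speed = 10.0 if open_loop_travel > 0.2 else 5.0
    return open_loop_travel, target_speed

travel_thin, target_thin = run_stroke(load=1.0)    # easy load
travel_thick, target_thick = run_stroke(load=2.0)  # stiff load
```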
[0227] Figure 19 is a schematic diagram of a surgical instrument 790 configured to control various functions according to one aspect of the present disclosure. In one aspect, the surgical instrument 790 is programmed to control the distal translation of a displacement member, such as the I-beam 764. The surgical instrument 790 comprises an end effector 792 that can comprise an anvil 766, an I-beam 764, and a removable staple cartridge 768 that can be interchanged with an RF cartridge 796 (shown in dashed lines).
[0228] In one aspect, the sensors 788 can be implemented as a limit switch, an electromechanical device, solid-state switches, Hall-effect devices, magnetoresistive (MR) devices, giant magnetoresistive (GMR) devices, magnetometers, among others. In other implementations, the sensors 788 can be solid-state switches that operate under the influence of light, such as optical sensors, infrared sensors, ultraviolet sensors, among others. Further, the switches can be solid-state devices such as transistors (for example, FET, junction FET, MOSFET, bipolar, and the like). In other implementations, the sensors 788 can include conductorless electrical switches, ultrasonic switches, accelerometers, inertial sensors, among others.
[0229] In one aspect, the position sensor 784 can be implemented as an absolute positioning system comprising a magnetic rotary absolute positioning system implemented as an AS5055EQFT single-chip magnetic rotary position sensor, available from Austria Microsystems, AG. The position sensor 784 can interface with the control circuit 760 to provide an absolute positioning system. The position sensor 784 can include multiple Hall-effect elements located above a magnet and coupled to a CORDIC processor, also known as the digit-by-digit method and Volder's algorithm, which is provided to implement a simple and efficient algorithm for calculating hyperbolic and trigonometric functions that requires only addition, subtraction, bit-shift, and table lookup operations.
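The CORDIC method named above really does reduce sine and cosine computation to additions, subtractions, shifts, and a small arctangent lookup table. Below is a minimal rotation-mode sketch; floating point stands in for the fixed-point arithmetic a sensor ASIC would use, and the iteration count is illustrative:

```python
import math

# Minimal CORDIC (Volder's algorithm) in rotation mode: rotate the
# vector (K, 0) toward angle theta using only add/subtract, halvings
# by powers of two (bit shifts in hardware), and an arctan table.

N = 24                                            # iterations ~ bits of precision
ATAN_TABLE = [math.atan(2.0 ** -i) for i in range(N)]
K = 1.0                                           # pre-scaled CORDIC gain
for i in range(N):
    K *= 1.0 / math.sqrt(1.0 + 2.0 ** (-2 * i))

def cordic_sin_cos(theta):
    """Return (sin(theta), cos(theta)) for |theta| within ~1.74 rad."""
    x, y, z = K, 0.0, theta
    for i in range(N):
        d = 1.0 if z >= 0 else -1.0               # rotation direction
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * ATAN_TABLE[i]                    # drive residual angle to 0
    return y, x

s, c = cordic_sin_cos(0.5)
```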
[0230] In one aspect, the I-beam 764 can be implemented as a knife member comprising a knife body that operably supports a tissue-cutting blade therein and can further include anvil engagement tabs or features and channel engagement features or a foot. In one aspect, the staple cartridge 768 can be implemented as a standard (mechanical) surgical fastener cartridge. In one aspect, the RF cartridge 796 can be implemented as an RF cartridge. These and other sensor arrangements are described in commonly owned US Patent Application Serial No. 15/628,175, entitled TECHNIQUES
[0231] The position, movement, displacement, and/or translation of a linear displacement member, such as the I-beam 764, can be measured by an absolute positioning system, a sensor arrangement, and a position sensor represented as the position sensor 784. Because the I-beam 764 is coupled to the longitudinally movable drive member, the position of the I-beam 764 can be determined by measuring the position of the longitudinally movable drive member employing the position sensor 784. Accordingly, in the following description, the position, displacement, and/or translation of the I-beam 764 can be obtained by the position sensor 784, as described in the present disclosure. A control circuit 760 can be programmed to control the translation of the displacement member, such as the I-beam 764, as described herein. The control circuit 760, in some examples, may comprise one or more microcontrollers, microprocessors, or other suitable processors to execute instructions that cause the processor or processors to control the displacement member, e.g., the I-beam 764, as described. In one aspect, a timer/counter 781 provides an output signal, such as the elapsed time or a digital count, to the control circuit 760 to correlate the position of the I-beam 764, as determined by the position sensor 784, with the output of the timer/counter 781, such that the control circuit 760 can determine the position of the I-beam 764 at a specific time (t) relative to a starting position. The timer/counter 781 can be configured to measure elapsed time, count external events, or time external events.
[0232] The control circuit 760 can generate a motor setpoint signal 772. The motor setpoint signal 772 can be provided to a motor controller 758. The motor controller 758 can comprise one or more circuits configured to provide a motor drive signal 774 to the motor 754 to drive the motor 754, as described in the present disclosure. In some examples, the motor 754 can be a brushed DC electric motor. For example, the speed of the motor 754 can be proportional to the motor drive signal 774. In some examples, the motor 754 can be a brushless DC electric motor, and the motor drive signal 774 can comprise a PWM signal provided to one or more stator windings of the motor 754. In addition, in some examples, the motor controller 758 may be omitted, and the control circuit 760 can generate the motor drive signal 774 directly.
[0233] The motor 754 can receive power from a power source
[0234] The control circuit 760 can be in communication with one or more sensors 788. The sensors 788 can be positioned on the end effector 792 and adapted to operate with the surgical instrument 790 to measure various derived parameters, such as gap distance versus time, tissue compression versus time, and anvil strain versus time. The sensors 788 can comprise a magnetic sensor, a magnetic field sensor, a strain gauge, a pressure sensor, a force sensor, an inductive sensor such as an eddy-current sensor, a resistive sensor, a capacitive sensor, an optical sensor, and/or any other suitable sensors for measuring one or more parameters of the end effector 792. The sensors 788 may include one or more sensors.
[0235] The one or more sensors 788 may comprise a strain gauge, such as a micro-strain gauge, configured to measure the magnitude of the strain in the anvil 766 during a clamped condition. The strain gauge provides an electrical signal whose amplitude varies with the magnitude of the strain. The sensors 788 can comprise a pressure sensor configured to detect a pressure generated by the presence of compressed tissue between the anvil 766 and the staple cartridge 768. The sensors 788 can be configured to detect the impedance of a tissue section located between the anvil 766 and the staple cartridge 768, which is indicative of the thickness and/or fullness of the tissue located therebetween.
[0236] The sensors 788 can be configured to measure the forces exerted on the anvil 766 by the closure drive system. For example, one or more sensors 788 can be at an interaction point between a closure tube and the anvil 766 to detect the closure forces applied by the closure tube to the anvil 766. The forces exerted on the anvil 766 can be representative of the tissue compression experienced by the tissue section captured between the anvil 766 and the staple cartridge 768. The one or more sensors 788 can be positioned at various interaction points along the closure drive system to detect the closure forces applied to the anvil 766 by the closure drive system. The one or more sensors 788 can be sampled in real time during a clamping operation by a processor portion of the control circuit 760. The control circuit 760 receives the real-time sample measurements to provide and analyze time-based information and assess, in real time, the closure forces applied to the anvil 766.
[0237] A current sensor 786 can be used to measure the current drawn by the motor 754. The force required to advance the I-beam 764 corresponds to the current drawn by the motor 754.
[0238] An RF power source 794 is coupled to the end effector 792 and is applied to the RF cartridge 796 when the RF cartridge 796 is loaded in the end effector 792 in place of the staple cartridge 768. The control circuit 760 controls the delivery of RF energy to the RF cartridge 796.
[0239] Additional details are disclosed in US Patent Application Serial No. 15/636,096, entitled SURGICAL SYSTEM COUPLABLE WITH STAPLE CARTRIDGE AND RADIO FREQUENCY CARTRIDGE, AND METHOD OF USING SAME, filed on June 28, 2017, which is hereby incorporated by reference in its entirety.

Generator hardware
[0240] Figure 20 is a simplified block diagram of a generator 800 configured to provide inductorless tuning, among other benefits. Additional details of the generator 800 are described in US Patent No. 9,060,775, entitled SURGICAL GENERATOR FOR ULTRASONIC AND ELECTROSURGICAL DEVICES, issued on June 23, 2015, which is hereby incorporated by reference in its entirety. The generator 800 can comprise a patient-isolated stage 802 in communication with a non-isolated stage 804 via a power transformer 806. A secondary winding 808 of the power transformer 806 is contained in the isolated stage 802
[0241] In certain forms, ultrasonic and electrosurgical drive signals can be provided simultaneously to separate surgical instruments and/or to a single surgical instrument, such as the multifunction surgical instrument, that has the capability to deliver both ultrasonic and electrosurgical energy to tissue. It will be appreciated that the electrosurgical signal provided to either the dedicated electrosurgical instrument or the combined multifunction ultrasonic/electrosurgical instrument can be either a therapeutic or a sub-therapeutic level signal, where the sub-therapeutic signal can be used, for example, to monitor tissue or instrument conditions and provide feedback to the generator. For example, the RF and ultrasonic signals can be provided separately or simultaneously from a generator with a single output port in order to provide the desired output signal to the surgical instrument, as discussed in more detail below. Accordingly, the generator can combine the ultrasonic and RF electrosurgical energies and deliver the combined energies to the multifunction ultrasonic/electrosurgical instrument. Bipolar electrodes can be placed on one or both jaws of the end effector. One jaw can be driven by ultrasonic energy in addition to the RF electrosurgical energy, working simultaneously. The ultrasonic energy can be employed to dissect tissue, while the RF electrosurgical energy can be employed to cauterize vessels.
[0242] The non-isolated stage 804 can comprise a power amplifier 812 having an output connected to a primary winding 814 of the power transformer 806. In certain forms, the power amplifier 812 can comprise a push-pull amplifier. For example, the non-isolated stage 804 can further comprise a logic device 816 for supplying a digital output to a digital-to-analog converter (DAC) circuit 818 which, in turn, supplies a corresponding analog signal to an input of the power amplifier 812. In certain forms, the logic device 816 can comprise a programmable gate array (PGA), a field-programmable gate array (FPGA), a programmable logic device (PLD), among other logic circuits, for example. The logic device 816, because it controls the input of the power amplifier 812 via the DAC circuit 818, can therefore control any of a number of parameters (for example, frequency, waveform shape, waveform amplitude) of drive signals appearing at the drive signal outputs 810a, 810b, and 810c. In certain forms and as discussed below, the logic device 816, in conjunction with a processor (for example, a DSP processor discussed below), can implement a number of DSP-based and/or other control algorithms to control parameters of the drive signals output by the generator 800.
[0243] Power can be supplied to a power rail of the power amplifier 812 by a switch-mode regulator 820, for example, a power converter. In certain forms, the switch-mode regulator 820 can comprise an adjustable buck regulator, for example. The non-isolated stage 804 can further comprise a first processor 822 which, in one form, can comprise a DSP processor, such as an ADSP-21469 SHARC DSP, available from Analog Devices, Norwood, MA, USA, for example, although in various forms any suitable processor can be employed. In certain forms, the DSP processor 822 can control the operation of the switch-mode regulator 820 responsive to voltage feedback data received from the power amplifier 812 by the DSP processor 822 via an ADC circuit 824. In one form, for example, the DSP processor 822 can receive as input, via the ADC circuit 824, the waveform envelope of a signal (for example, an RF signal) being amplified by the power amplifier 812. The DSP processor 822 can then control the switch-mode regulator 820 (for example, via a PWM output) such that the rail voltage supplied to the power amplifier 812 tracks the waveform envelope of the amplified signal. By dynamically modulating the rail voltage of the power amplifier 812 based on the waveform envelope, the efficiency of the power amplifier 812 can be significantly improved relative to fixed-rail-voltage amplifier schemes.
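The envelope-tracking idea above, the rail voltage following the amplified signal's envelope plus some headroom rather than sitting at a fixed level, can be sketched very simply. The headroom and rail limits below are invented, and a real regulator would of course be a control loop rather than a pure function:

```python
# Sketch: choose a supply-rail voltage just above the instantaneous
# waveform envelope, clamped to the regulator's physical limits.
# Headroom, v_min, and v_max are illustrative assumptions.

def rail_voltage(envelope_sample, headroom=2.0, v_min=5.0, v_max=48.0):
    v = envelope_sample + headroom      # keep the amplifier out of clipping
    return min(max(v, v_min), v_max)    # clamp to the regulator's range

tracked = rail_voltage(10.0)            # envelope at 10 V -> 12 V rail
```

Because the amplifier dissipates roughly the difference between rail and output, keeping the rail just above the envelope is where the efficiency gain over a fixed rail comes from.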
[0244] In certain forms, the logic device 816, in conjunction with the DSP processor 822, can implement a digital synthesis circuit such as a direct digital synthesizer (DDS) control scheme to control the waveform shape, frequency, and/or amplitude of the drive signals output by the generator 800. In one form, for example, the logic device 816 can implement a DDS control algorithm by retrieving waveform samples stored in a dynamically updated lookup table (LUT), such as a RAM LUT, which can be embedded in an FPGA.
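A minimal DDS sketch of the scheme described above: a phase accumulator steps through a waveform LUT, and the tuning word sets the output frequency. The table size and accumulator width below are common textbook choices, not the generator's actual parameters:

```python
import math

# Direct digital synthesis sketch: a 32-bit phase accumulator indexes
# a 1024-entry sine LUT; the tuning word sets f_out = word/2^32 * f_clk.
# Sizes are illustrative; an FPGA would hold the LUT in RAM so the
# samples can be rewritten (e.g., pre-distorted) on the fly.

LUT_BITS = 10
LUT = [math.sin(2 * math.pi * i / (1 << LUT_BITS))
       for i in range(1 << LUT_BITS)]
ACC_BITS = 32

def dds_samples(f_out, f_clk, n):
    tuning_word = round(f_out / f_clk * (1 << ACC_BITS))
    phase, out = 0, []
    for _ in range(n):
        out.append(LUT[phase >> (ACC_BITS - LUT_BITS)])   # top bits index LUT
        phase = (phase + tuning_word) & ((1 << ACC_BITS) - 1)
    return out

# One 1 kHz cycle at a 1 MHz sample clock
samples = dds_samples(1000.0, 1_000_000.0, 1000)
```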
[0245] The non-isolated stage 804 can further comprise a first A-D converter circuit 826 and a second A-D converter circuit 828 coupled to the output of the power transformer 806 via respective isolation transformers 830 and 832 to sample the voltage and current of the drive signals output by the generator 800. In certain forms, the A-D converter circuits 826 and 828 can be configured to sample at high speeds [for example, 80 mega-samples per second (MSPS)] to enable oversampling of the drive signals. In one form, for example, the sampling speed of the A-D converter circuits 826 and 828 can enable approximately 200x oversampling (depending on the frequency) of the drive signals. In certain forms, the sampling operations of the A-D converter circuits 826 and 828 can be performed by a single A-D converter circuit that receives the input voltage and current signals via a two-way multiplexer. The use of high-speed sampling in forms of the generator 800 can enable, among other things, calculation of the complex current flowing through the motional branch (which can be used in certain forms to implement the DDS-based waveform-shape control described above), accurate digital filtering of the sampled signals, and calculation of real power consumption with a high degree of precision. The voltage and current feedback data output by the A-D converter circuits 826 and 828 can be received and processed [for example, through a first-in-first-out (FIFO) buffer and a multiplexer] by the logic device 816 and stored in data memory for subsequent retrieval by, for example, the processor 822. As noted above, the voltage and current feedback data can be used as input to an algorithm for dynamically and continuously pre-distorting or modifying the LUT waveform samples.
In certain forms, this may require that each stored voltage and current feedback data pair be indexed based on, or otherwise associated with, a corresponding LUT sample that was output by the logic device 816 when the voltage and current feedback data pair was acquired. Synchronizing the LUT samples with the voltage and current feedback data in this manner contributes to the correct timing and stability of the pre-distortion algorithm.
[0246] In certain forms, the voltage and current feedback data can be used to control the frequency and/or amplitude (for example, current amplitude) of the drive signals. In one form, for example, the voltage and current feedback data can be used to determine the impedance phase. The frequency of the drive signal can then be controlled to minimize or reduce the difference between the determined impedance phase and an impedance phase setpoint (for example, 0°), thereby minimizing or reducing the effects of harmonic distortion and correspondingly enhancing the accuracy of impedance phase measurement. The determination of the impedance phase and a frequency control signal can be implemented in the DSP processor 822, for example, with the frequency control signal being supplied as input to a DDS control algorithm implemented by the logic device 816.
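One common way to determine the impedance phase from sampled voltage and current, sketched below as a pure-Python stand-in for the DSP computation, is to correlate each signal against quadrature references at the drive frequency and difference the resulting phases. The function names and signal parameters are assumptions, not the generator's implementation:

```python
import math

# Sketch: estimate the phase of a sampled sinusoid by I/Q correlation,
# then take impedance phase = phase(V) - phase(I). Works exactly when
# the record spans an integer number of cycles.

def phase_deg(samples, f, fs):
    """Phase (degrees) of a sinusoid sampled at fs with frequency f."""
    re = sum(s * math.cos(2 * math.pi * f * n / fs)
             for n, s in enumerate(samples))
    im = sum(s * -math.sin(2 * math.pi * f * n / fs)
             for n, s in enumerate(samples))
    return math.degrees(math.atan2(im, re))

def impedance_phase(v_samples, i_samples, f, fs):
    return phase_deg(v_samples, f, fs) - phase_deg(i_samples, f, fs)
```

With the phase in hand, a frequency control loop can nudge the drive signal until this value reaches the setpoint (for example, 0°).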
[0247] In another form, for example, the current feedback data can be monitored in order to maintain the current amplitude of the drive signal at a current amplitude setpoint. The current amplitude setpoint can be specified directly or determined indirectly based on specified voltage amplitude and power setpoints. In certain forms, control of the current amplitude can be implemented by a control algorithm, such as, for example, a proportional-integral-derivative (PID) control algorithm, in the DSP processor 822. Variables controlled by the control algorithm to suitably control the current amplitude of the drive signal can include, for example, the scaling of the LUT waveform samples stored in the logic device 816 and/or the full-scale output voltage of the DAC circuit 818 (which supplies the input to the power amplifier 812) via a DAC circuit 834.
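The amplitude regulation described above can be sketched with an integral-only loop that adjusts a waveform scale factor (standing in for LUT scaling or the DAC full-scale voltage) against a toy linear plant. The gain, integrator constant, and plant model are all invented for illustration:

```python
# Sketch: integral control of drive-signal current amplitude via a
# scale factor. "gain_to_current" is a toy model of how scaling the
# LUT/DAC output maps to measured current amplitude; all values are
# assumptions, not the generator's parameters.

def regulate_amplitude(setpoint, gain_to_current=2.0, ki=0.2, steps=200):
    scale = 0.0
    for _ in range(steps):
        measured = scale * gain_to_current      # toy plant response
        scale += ki * (setpoint - measured)     # integrate the error
    return scale * gain_to_current              # final measured amplitude

settled = regulate_amplitude(3.0)
```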
[0248] The non-isolated stage 804 can further comprise a second processor 836 for providing, among other things, user interface (UI) functionality. In one form, the UI processor 836 can comprise an Atmel AT91SAM9263 processor having an ARM 926EJ-S core, available from Atmel Corporation, San Jose, CA, USA, for example. Examples of UI functionality supported by the UI processor 836 can include audible and visual user feedback, communication with peripheral devices (for example, via a USB interface),
[0249] In certain forms, both the DSP processor 822 and the UI processor 836 can, for example, determine and monitor the operating state of the generator 800. For the DSP processor 822, the operating state of the generator 800 can dictate, for example, which control and/or diagnostic processes are implemented by the DSP processor 822. For the UI processor 836, the operating state of the generator 800 can dictate, for example, which elements of a UI (for example, display screens, sounds) are presented to a user. The respective DSP and UI processors 822 and 836 can independently maintain the current operating state of the generator 800 and recognize and evaluate possible transitions out of the current operating state. The DSP processor 822 can function as the master in this relationship and determine when transitions between operating states should occur. The UI processor 836 can be aware of valid transitions between operating states and can confirm whether a particular transition is appropriate. For example, when the DSP processor 822 instructs the UI processor 836 to transition to a specific state, the UI processor 836 can verify that the requested transition is valid. If a requested inter-state transition is determined to be invalid by the UI processor 836, the UI processor 836 can cause the generator 800 to enter a failure mode.
[0250] The non-isolated stage 804 can further comprise a controller 838 for monitoring input devices (for example, a capacitive touch sensor used for turning the generator 800 on and off, a capacitive touch screen). In certain forms, the controller 838 can comprise at least one processor and/or other controller device in communication with the UI processor 836. In one form, for example, the controller 838 can comprise a processor (for example, an 8-bit ATmega168 controller available from Atmel) configured to monitor user input provided via one or more capacitive touch sensors. In one form, the controller 838 can comprise a touch-screen controller (for example, a QT5480 touch-screen controller available from Atmel) to control and manage the acquisition of touch data from a capacitive touch screen.
[0251] In certain forms, when the generator 800 is in a "power off" state, the controller 838 can continue to receive operating power (for example, via a line from a power supply of the generator 800, such as the power supply 854 discussed below). In this way, the controller 838 can continue to monitor an input device (for example, a capacitive touch sensor located on a front panel of the generator 800) for turning the generator 800 on and off. When the generator 800 is in the power-off state, the controller 838 can wake the power supply (for example,
[0252] In certain forms, the controller 838 can cause the generator 800 to provide audible or other sensory feedback for alerting the user that a power-on or power-off sequence has been initiated. Such an alert can be provided at the beginning of a power-on or power-off sequence and prior to the commencement of other processes associated with the sequence.
[0253] In certain forms, the isolated stage 802 can comprise an instrument interface circuit 840 to, for example, provide a communication interface between a control circuit of a surgical instrument (for example, a control circuit comprising handle switches) and components of the non-isolated stage 804, such as the logic device 816, the DSP processor 822, and/or the UI processor 836. The instrument interface circuit 840 can exchange information with components of the non-isolated stage 804 via a communication link that maintains a suitable degree of electrical isolation between the isolated and non-isolated stages 802 and 804, such as, for example, an IR-based communication link. Power can be supplied to the instrument interface circuit 840 using, for example, a low-dropout voltage regulator powered by an isolation transformer driven from the non-isolated stage 804.
[0254] In one form, the instrument interface circuit 840 can comprise a logic circuit 842 (for example, a logic circuit, a programmable logic circuit, a PGA, an FPGA, a PLD) in communication with a signal conditioning circuit 844. The signal conditioning circuit 844 can be configured to receive a periodic signal from the logic circuit 842 (for example, a 2 kHz square wave) to generate a bipolar interrogation signal having an identical frequency. The interrogation signal can be generated, for example, using a bipolar current source fed by a differential amplifier. The interrogation signal can be communicated to a control circuit of a surgical instrument (for example, using a conductive pair in a cable that connects the generator 800 to the surgical instrument) and monitored to determine a state or configuration of the control circuit. The control circuit can comprise a number of switches, resistors, and/or diodes to modify one or more characteristics (for example, amplitude, rectification) of the interrogation signal such that a state or configuration of the control circuit is uniquely discernible based on the one or more characteristics. In one form, for example, the signal conditioning circuit 844 can comprise an A-D converter circuit for generating samples of a voltage signal appearing across inputs of the control circuit resulting from passage of the interrogation signal therethrough. The logic circuit 842 (or a non-isolated stage component
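How a switch/diode network could make control-circuit states uniquely discernible from the interrogation signal's amplitude and rectification can be illustrated with a hypothetical decoder. The thresholds, state names, and the idea of classifying by positive/negative peaks are entirely invented for illustration; the patent does not specify this scheme:

```python
# Hypothetical sketch: classify a control-circuit state from the
# positive and negative peaks of the returned bipolar interrogation
# signal. A diode in the circuit clips one polarity; a closed switch
# shunts both. All thresholds and state labels are assumptions.

def decode_switch_state(v_pos_peak, v_neg_peak, threshold=0.2):
    if v_pos_peak < threshold and abs(v_neg_peak) < threshold:
        return "short/activated"        # both polarities shunted
    if abs(v_neg_peak) < threshold:
        return "switch A closed"        # diode path clips negative half
    if v_pos_peak < threshold:
        return "switch B closed"        # diode path clips positive half
    return "idle"                       # full bipolar signal returned
```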
[0255] In one form, the instrument interface circuit 840 may comprise a first data circuit interface 846 to enable the exchange of information between the logic circuit 842 (or another element of the instrument interface circuit 840) and a first data circuit disposed in, or otherwise associated with, a surgical instrument. In certain forms, for example, a first data circuit may be arranged on a cable integrally attached to a handle of the surgical instrument, or on an adapter for interfacing a specific type or model of surgical instrument with the generator 800. The first data circuit can be deployed in any suitable manner and can communicate with the generator in accordance with any suitable protocol, including, for example, as described here with respect to the first data circuit. In certain forms, the first data circuit may comprise a non-volatile storage device, such as an EEPROM device. In certain forms, the first data circuit interface 846 can be implemented separately from the logic circuit 842 and comprise suitable circuitry (for example, separate logic devices, a processor) to allow communication between the logic circuit 842 and the first data circuit. In other forms, the first data circuit interface 846 can be integral with the logic circuit 842.
[0256] In certain forms, the first data circuit can store information related to the specific surgical instrument with which it is associated. This information may include, for example, a model number, a serial number, a number of operations in which the surgical instrument has been used, and/or any other types of information. This information can be read by the instrument interface circuit 840 (for example, the logic circuit 842) and transferred to a component of the non-isolated stage 804 (for example, to the logic device 816, the DSP processor 822 and/or the UI processor 836) for presentation to a user by means of an output device and/or to control a function or operation of the generator 800. Additionally, any type of information can be communicated to the first data circuit for storage therein via the first data circuit interface 846 (for example, using the logic circuit 842). This information may include, for example, an updated number of operations in which the surgical instrument has been used and/or the dates and/or times of its use.
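One way to picture such a stored record is as a small fixed-width structure in the EEPROM. The sketch below is purely illustrative: the field widths, byte order, and the sample model/serial strings ("GEN800-ADPT", "SN0001") are assumptions, not values from the disclosure.

```python
import struct

# Hypothetical fixed-width record for the instrument's nonvolatile data
# circuit, holding the fields listed above: model number, serial number,
# and a count of operations in which the instrument has been used.
RECORD = struct.Struct("<16s16sI")  # little-endian: 16-byte model, 16-byte serial, uint32 count

def pack_record(model, serial, uses):
    """Encode a record image as it might be written to the EEPROM."""
    return RECORD.pack(model.encode(), serial.encode(), uses)

def increment_use_count(raw):
    """Read a record, bump the operation count, and return the new image,
    mirroring the generator writing back an updated count after a case."""
    model, serial, uses = RECORD.unpack(raw)
    return RECORD.pack(model, serial, uses + 1)

raw = pack_record("GEN800-ADPT", "SN0001", 12)
_, _, uses = RECORD.unpack(increment_use_count(raw))
print(uses)  # 13
```

A real device would add a checksum or CRC so a corrupted record cannot silently misreport the instrument's history.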
[0257] As discussed earlier, a surgical instrument can be removable from a handle (for example, the multifunctional surgical instrument can be removable from the handle) to promote interchangeability and/or disposability of the instrument. In such cases, conventional generators may be limited in their ability to recognize the specific instrument configurations being used and to optimize the control and diagnostic processes accordingly. The addition of readable data circuits to surgical instruments to address this issue is problematic from a compatibility point of view, however. For example, designing a surgical instrument to remain backward compatible with generators that lack the requisite data reading functionality may be impractical due to, for example, different signaling schemes, design complexity and cost. The forms of instruments discussed here address these concerns through the use of data circuits that can be implemented in existing surgical instruments economically and with minimal design changes to preserve the compatibility of the surgical instruments with current generator platforms.
[0258] Additionally, forms of the generator 800 can enable communication with instrument-based data circuits. For example, the generator 800 can be configured to communicate with a second data circuit contained in an instrument (for example, the multifunctional surgical instrument). In some forms, the second data circuit can be implemented in a manner similar to that of the first data circuit described here. The instrument interface circuit 840 may comprise a second data circuit interface 848 to enable such communication. In one form, the second data circuit interface 848 can comprise a tri-state digital interface, although other interfaces can also be used. In certain forms, the second data circuit can generally be any circuit for transmitting and/or receiving data. In one form, for example, the second data circuit can store information related to the specific surgical instrument with which it is associated. This information may include, for example, a model number, a serial number, a number of operations in which the surgical instrument has been used, and/or any other types of information.
[0259] In some forms, the second data circuit can store information about the ultrasonic and/or electrical properties of an associated ultrasonic transducer, end effector or ultrasonic drive system. For example, the first data circuit can indicate an initialization frequency slope, as described here. In addition or alternatively, any type of information can be communicated to the second data circuit for storage therein via the second data circuit interface 848 (for example, using the logic circuit 842). This information may include, for example, an updated number of operations in which the surgical instrument has been used and/or the dates and/or times of its use. In certain forms, the second data circuit can transmit data captured by one or more sensors (for example, an instrument-based temperature sensor). In certain forms, the second data circuit can receive data from the generator 800 and provide an indication to a user (for example, a light-emitting indication or other visible indication) based on the received data.
[0260] In certain forms, the second data circuit and the second data circuit interface 848 can be configured so that communication between the logic circuit 842 and the second data circuit can be carried out without the need to provide additional conductors for this purpose (for example, dedicated conductors of a cable that connects a handle to the generator 800). In one form, for example, information can be communicated to and from the second data circuit using a one-wire bus communication scheme implemented on existing wiring, such as one of the conductors used to transmit interrogation signals from the signal conditioning circuit 844 to a control circuit in a handle. In this way, changes or modifications to the design of the surgical instrument that might otherwise be necessary are minimized or reduced. In addition, because the different types of communication implemented on a common physical channel can be separated based on frequency, the presence of a second data circuit can be "invisible" to generators that do not have the requisite data reading functionality, which therefore preserves the backward compatibility of the surgical instrument.
[0261] In certain forms, the isolated stage 802 may comprise at least one blocking capacitor 850-1 connected to the drive signal output 810b to prevent the passage of direct current to a patient. A single blocking capacitor may be required to comply with medical regulations and standards, for example. Although failures in single-capacitor designs are relatively uncommon, such failures can still have negative consequences. In one form, a second blocking capacitor 850-2 can be provided in series with the blocking capacitor 850-1, with a point between the blocking capacitors 850-1 and 850-2 being monitored for leakage current, for example, by an A/D converter circuit 852 that samples a voltage induced by the leakage current. The samples can be received, for example, by the logic circuit 842. Based on changes in the leakage current (as indicated by the voltage samples), the generator 800 can determine when one of the blocking capacitors 850-1 or 850-2 has failed, thereby providing a benefit over single-capacitor designs that have a single point of failure.
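The fault check this paragraph describes reduces to watching the midpoint voltage for a sustained shift from its baseline. A minimal sketch of that logic follows; the baseline value, tolerance band, and sample values are illustrative assumptions, not figures from the disclosure.

```python
# Hypothetical sketch of the dual blocking-capacitor check: the A/D
# converter samples the voltage at the point between capacitors 850-1 and
# 850-2, and a drift outside an allowed band around the healthy baseline
# flags a failed (shorted) capacitor. Threshold values are assumptions.

def capacitor_fault(midpoint_samples, baseline=0.0, limit=0.5):
    """Return True when any leakage-induced midpoint voltage sample
    falls outside the allowed band around its baseline."""
    return any(abs(v - baseline) > limit for v in midpoint_samples)

print(capacitor_fault([0.01, -0.02, 0.03]))  # False: both capacitors block DC
print(capacitor_fault([0.01, 0.90, 1.10]))   # True: leakage indicates a failure
```

With both capacitors intact the midpoint carries essentially no DC, so any persistent offset implicates exactly the redundancy the second capacitor was added to provide.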
[0262] In certain forms, the non-isolated stage 804 may comprise a power supply 854 for delivering direct current power at a suitable voltage and current. The power supply may comprise, for example, a 400 W power supply to deliver a system voltage of 48 VDC. The power supply 854 may additionally comprise one or more DC/DC voltage converters 856 that receive the power supply output in order to generate direct current outputs at the voltages and currents required by the various components of the generator 800. As discussed above with respect to the controller 838, one or more of the DC/DC voltage converters 856 can receive an input from the controller 838 when activation of the "on/off" input device by a user is detected by the controller 838, to enable the operation or activation of the DC/DC voltage converters 856.
[0263] Figure 21 illustrates an example of a generator 900, which is a form of the generator 800 (Figure 20). The generator 900 is configured to supply multiple types of energy to a surgical instrument. The generator 900 provides ultrasonic and RF signals to power a surgical instrument, independently or simultaneously; the ultrasonic and RF signals can be provided alone or in combination. As indicated above, at least one generator output can provide multiple types of energy (for example, ultrasonic, bipolar or monopolar RF, irreversible and/or reversible electroporation, and/or microwave energy, among others) through a single port, and these signals can be supplied separately or simultaneously to the end effector to treat tissue.
[0264] The generator 900 comprises a processor 902 coupled to a waveform generator 904. The processor 902 and the waveform generator 904 are configured to generate various signal waveforms based on information stored in a memory coupled to the processor 902, not shown for clarity of description. The digital information associated with a waveform is provided to the waveform generator 904, which includes one or more DAC circuits to convert the digital input to an analog output. The analog output is fed to an amplifier 906 for signal conditioning and amplification. The conditioned and amplified output of the amplifier 906 is coupled to a power transformer 908. The signals are coupled via the power transformer 908 to the secondary side, which is on the patient isolation side. A first signal of a first energy modality is supplied to the surgical instrument between the terminals identified as ENERGY1 and RETURN.
[0265] A first voltage detection circuit 912 is coupled across the terminals identified as ENERGY1 and the RETURN path to measure the output voltage between them. A second voltage detection circuit 924 is coupled across the terminals identified as ENERGY2 and the RETURN path to measure the output voltage between them. A current detection circuit 914 is arranged in series with the RETURN leg on the secondary side of the power transformer 908, as shown, to measure the output current for any energy modality. If different return paths are provided for each energy modality, then a separate current detection circuit would be provided on each return leg. The outputs of the first and second voltage detection circuits 912, 924 are supplied to respective isolation transformers 916, 922, and the output of the current detection circuit 914 is supplied to another isolation transformer 918. The outputs of the isolation transformers 916, 918, 922 on the primary side of the power transformer 908 (the non-isolated side relative to the patient) are supplied to one or more ADC circuits 926. The digitized output of the ADC circuit 926 is provided to the processor 902 for further processing and computation. The output voltages and the output current feedback information can be used to adjust the output voltage and current supplied to the surgical instrument, and to compute the output impedance, among other parameters. Input/output communications between the processor 902 and the patient-isolated circuits are provided via an interface circuit 920. Sensors may also be in electrical communication with the processor 902 via the interface circuit 920.
[0266] In one aspect, impedance can be determined by the processor 902 by dividing the output of the first voltage detection circuit 912, coupled across the terminals identified as ENERGY1/RETURN, or of the second voltage detection circuit 924, coupled across the terminals identified as ENERGY2/RETURN, by the output of the current detection circuit 914 arranged in series with the RETURN leg on the secondary side of the power transformer 908. The outputs of the first and second voltage detection circuits 912, 924 are provided to separate isolation transformers 916, 922, and the output of the current detection circuit 914 is provided to another isolation transformer 918. The digitized voltage and current detection measurements from the ADC circuit 926 are provided to the processor 902 to compute the impedance. As an example, the first energy modality ENERGY1 can be ultrasonic energy and the second energy modality ENERGY2 can be RF energy. However, in addition to the ultrasonic and bipolar or monopolar RF energy modalities, other energy modalities include irreversible and/or reversible electroporation and/or microwave energy, among others. In addition, while the example shown in Figure 21 shows a single return path RETURN provided for two or more energy modalities, in other aspects multiple return paths RETURN can be provided for each energy modality. Thus, as described here, the impedance of the ultrasonic transducer can be measured by dividing the output of the first voltage detection circuit 912 by that of the current detection circuit 914, and the tissue impedance can be measured by dividing the output of the second voltage detection circuit 924 by that of the current detection circuit 914.
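The division described above can be illustrated numerically. The sketch below computes an impedance magnitude from digitized voltage and current samples; the RMS approach, the sample values, and the omission of phase handling are all simplifying assumptions for illustration, not the disclosed algorithm.

```python
import math

# Illustrative computation of load impedance from digitized samples of the
# voltage detection circuits (912/924) and the current detection circuit
# (914), as delivered to the processor 902 by the ADC circuit 926.

def rms(samples):
    """Root-mean-square value of a list of samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def impedance(voltage_samples, current_samples):
    """|Z| = Vrms / Irms (magnitude only; phase handling omitted)."""
    return rms(voltage_samples) / rms(current_samples)

v = [100.0, -100.0, 100.0, -100.0]   # volts (invented samples)
i = [0.5, -0.5, 0.5, -0.5]           # amperes (invented samples)
print(impedance(v, i))  # 200.0
```

For RF tissue sealing, tracking this quotient over time is what lets a generator detect the characteristic impedance rise as tissue desiccates.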
[0267] As shown in Figure 21, the generator 900 comprising at least one output port can include a power transformer 908 with a single output and multiple taps to provide power in the form of one or more energy modalities, such as ultrasonic, bipolar or monopolar RF, irreversible and/or reversible electroporation, and/or microwave energy, among others, for example, to the end effector, depending on the type of tissue treatment being performed. For example, the generator 900 can supply power with a higher voltage and lower current to drive an ultrasonic transducer, with a lower voltage and higher current to drive RF electrodes to seal tissue, or with a coagulation waveform for spot coagulation using monopolar or bipolar RF electrosurgical electrodes. The output waveform of the generator 900 can be steered, switched or filtered to provide the frequency to the end effector of the surgical instrument. The connection of an ultrasonic transducer to the output of the generator 900 would preferably be located between the output identified as ENERGY1 and the RETURN, as shown in Figure 21. In one example, a connection of bipolar RF electrodes to the output of the generator 900 would preferably be located between the output identified as ENERGY2 and the RETURN. In the case of a monopolar output, the preferred connections would be an active electrode (for example, a pencil or other probe) to the ENERGY2 output and a suitable return pad connected to the RETURN output.
[0268] Additional details are disclosed in US Patent Application Publication No. 2017/0086914, entitled TECHNIQUES FOR OPERATING GENERATOR FOR DIGITALLY GENERATING
[0269] As used throughout this description, the term "wireless" and its derivatives can be used to describe circuits, devices, systems, methods, techniques, communication channels, etc., that can communicate data through the use of electromagnetic radiation modulated through a non-solid medium. The term does not imply that the associated devices do not contain any wires, although in some forms they may not. The communication module can implement any of a number of wireless or wired communication standards or protocols, including, but not limited to, Wi-Fi (IEEE 802.11 family), WiMAX (IEEE 802.16 family), IEEE 802.20, long-term evolution (LTE), Ev-DO, HSPA+, HSDPA+, HSUPA+, EDGE, GSM, GPRS, CDMA, TDMA, DECT, Bluetooth, Ethernet, derivatives thereof, as well as any other wireless and wired protocols that are designated as 3G, 4G, 5G, and beyond. The computing module can include a plurality of communication modules. For example, a first communication module can be dedicated to shorter-range wireless communications such as Wi-Fi and Bluetooth, and a second communication module can be dedicated to longer-range wireless communications such as GPS, EDGE, GPRS, CDMA, WiMAX, LTE, Ev-DO, and others.
[0270] As used in the present invention, a processor or processing unit is an electronic circuit that performs operations on some external data source, usually memory or some other data stream. The term is used in the present invention to refer to the central processor (central processing unit) in a computer system or in systems (specifically systems on a chip (SoCs)) that combine several specialized "processors".
[0271] As used here, a system on a chip or system-on-chip (SoC or SOC) is an integrated circuit (also known as an "IC" or "chip") that integrates all the components of a computer or other electronic system. It can contain digital, analog, mixed-signal and often radio frequency functions, all on a single substrate. A SoC integrates a microcontroller (or microprocessor) with advanced peripherals such as a graphics processing unit (GPU), a Wi-Fi module, or a coprocessor. A SoC may or may not contain internal memory.
[0272] As used here, a microcontroller or controller is a system that integrates a microprocessor with peripheral circuits and memory. A microcontroller (or MCU, for microcontroller unit) can be implemented as a small computer on a single integrated circuit. It can be similar to a SoC; a SoC can include a microcontroller as one of its components. A microcontroller can contain one or more processor cores (CPUs) along with memory and programmable input/output peripherals. Program memory in the form of ferroelectric RAM, NOR flash or OTP ROM is also often included on the chip, as well as a small amount of RAM. Microcontrollers can be used for embedded applications, in contrast to microprocessors used in personal computers or other general-purpose applications composed of several separate integrated circuits.
[0273] As used in the present invention, the term controller or microcontroller may refer to an independent chip or IC (integrated circuit) device that interfaces with a peripheral device. This may be a link between two parts of a computer, or a controller on an external device that manages the operation of (and connection to) that device.
[0274] Any of the processors or microcontrollers in the present invention can be implemented by any single-core or multi-core processor, such as those known under the trade name ARM Cortex, from Texas Instruments. In one aspect, the processor may be an LM4F230H5QR ARM Cortex-M4F processor core, available from Texas Instruments, for example, which comprises 256 KB of integrated single-cycle flash memory, or other non-volatile memory, at up to 40 MHz, a prefetch buffer to optimize performance above 40 MHz, 32 KB of single-cycle serial random access memory (SRAM), internal read-only memory (ROM) loaded with the StellarisWare® software, 2 KB of electrically erasable programmable read-only memory (EEPROM), one or more pulse width modulation (PWM) modules, one or more quadrature encoder inputs (QEI), and one or more 12-bit analog-to-digital converters (ADC) with 12 analog input channels, details of which are available in the product data sheet.
[0275] In one aspect, the processor may comprise a safety controller comprising two controller-based families, such as TMS570 and RM4x, known under the trade name Hercules ARM Cortex R4, also from Texas Instruments. The safety controller can be configured specifically for IEC 61508 and ISO 26262 safety-critical applications, among others, to provide advanced integrated safety features while delivering scalable performance, connectivity and memory options.
[0276] Modular devices include modules (as described in connection with Figures 3 and 9, for example) that are receivable within a central surgical controller, and the surgical devices or instruments that can be connected to the various modules in order to connect or pair with the corresponding central surgical controller. Modular devices include, for example, smart surgical instruments, medical imaging devices, suction/irrigation devices, smoke evacuators, energy generators, ventilators, insufflators and displays. The modular devices described here can be controlled by control algorithms. The control algorithms can be executed on the modular device itself, on the central surgical controller with which the specific modular device is paired, or on both the modular device and the central surgical controller (for example, through a distributed computing architecture). In some examples, the control algorithms of the modular devices control the devices based on data detected by the modular device itself (that is, by sensors on, in or connected to the modular device). This data can be related to the patient being operated on (for example, tissue properties or insufflation pressure) or to the modular device itself (for example, the rate at which a knife is being advanced, the motor current, or the energy levels). For example, a control algorithm for a surgical stapling and cutting instrument can control the rate at which the instrument's motor drives its knife through the tissue according to the resistance encountered by the knife as it advances.
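The knife-rate example at the end of the paragraph can be sketched as a simple feedback rule: infer load from motor current and slow the advance as the load rises. This is a minimal illustrative sketch; the rate limits, nominal current, and gain are invented values, not parameters from the disclosure.

```python
# Hypothetical control rule for a stapling/cutting instrument: map the
# measured motor current (a proxy for the resistance the knife meets in
# the tissue) to a commanded advance rate. All constants are assumptions.

def knife_rate(motor_current_amps,
               max_rate_mm_s=15.0, min_rate_mm_s=1.0,
               nominal_current=0.5, gain=10.0):
    """Return the commanded knife advance rate in mm/s for a given load."""
    excess = max(0.0, motor_current_amps - nominal_current)
    rate = max_rate_mm_s - gain * excess   # slow down proportionally to load
    return max(min_rate_mm_s, min(max_rate_mm_s, rate))

print(knife_rate(0.4))  # 15.0 -> light load, full speed
print(knife_rate(1.0))  # 10.0 -> thicker tissue, slowed
print(knife_rate(3.0))  # 1.0  -> heavy load, clamped to the minimum rate
```

Clamping to a minimum rather than zero reflects the design choice that the firing stroke should complete, slowly, rather than stall mid-cut.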
[0277] During a surgical procedure, it may be necessary for a surgeon to manipulate tissues to achieve a desired medical result. The surgeon's actions are limited by what is visually observable at the surgical site. Thus, the surgeon may not be aware, for example, of the arrangement of vascular structures beneath the tissues being manipulated during the procedure. Since the surgeon is unable to view the vasculature beneath a surgical site, the surgeon may accidentally sever one or more blood vessels during the procedure. The solution is a surgical visualization system that can capture imaging data from the surgical site for presentation to a surgeon, and this presentation may include information related to the presence and depth of vascular structures located below the surface of a surgical site.
[0278] In one aspect, the central surgical controller 106 incorporates a visualization system 108 to capture imaging data during a surgical procedure. The visualization system 108 can include one or more light sources and one or more light sensors. The one or more light sources and one or more light sensors may be incorporated together into a single device or may comprise one or more separate devices. The one or more light sources can be directed to illuminate portions of the surgical field. The one or more image sensors can receive light reflected or refracted from the surgical field, including light reflected or refracted from tissue and/or surgical instruments. The following description includes all of the hardware and software processing techniques disclosed above and in the applications incorporated herein by reference as presented above.
[0279] In some aspects, the visualization system 108 can be integrated with a surgical system 100 as disclosed above and depicted in Figures 1 and 2. In addition to the visualization system 108, the surgical system 100 can include one or more intelligent portable instruments 112, a multifunctional robotic system 110, one or more visualization systems 108 and a centralized central surgical controller system 106, among other components. The centralized central surgical controller system 106 can control various functions as disclosed above and also depicted in Figure 3. In a non-limiting example, such functions may include supplying power to any number of motor-equipped surgical devices and controlling the energy supplied to them. In another non-limiting example, such functions may include controlling the fluid delivered to and evacuated from the surgical site. The centralized central surgical controller system 106 can also be configured to manage and analyze data received from any of the components of the surgical system, as well as to communicate data and other information between the components of the surgical system. The centralized central surgical controller system 106 may also be in data communication with a cloud computing system 104 as disclosed above and represented, for example, in Figure 1.
[0280] In some non-limiting examples, the imaging data generated by the visualization system 108 can be analyzed by embedded computational components of the visualization system 108, and the analysis results can be communicated to the centralized central surgical controller 106. In alternative non-limiting examples, the imaging data generated by the visualization system 108 can be communicated directly to the centralized central surgical controller 106, where the data can be analyzed by computational components in the centralized controller system 106. The centralized central surgical controller 106 can communicate the results of the image analysis to any one or more of the other components of the surgical system. In some other non-limiting examples, the centralized central surgical controller can communicate the image data and/or the results of the image analysis to the cloud computing system 104.
[0281] Figures 22A to 22D and Figures 23A to 23F represent various aspects of an example of a 2108 display system that can be incorporated into a surgical system. The 2108 display system may include an image control unit 2002 and a manual unit 2020. The image control unit 2002 may include one or more light sources, a power source for the one or more light sources, one or more types of data communication interfaces (including USB, Ethernet or wireless interfaces 2004) and one or more video outputs 2006. The imaging control unit 2002 can additionally include an interface, such as a USB interface 2010, configured to transmit integrated image and video capture data to a USB-enabled device. The imaging control unit 2002 may also include one or more computational components including, without limitation, a processor unit, a transient memory unit, a non-transient memory unit, an image processing unit, a bus structure to form data connections between the computational components, and any interface devices (for example, input and/or output) needed to receive and transmit information to components not included in the imaging control unit. The non-transient memory can additionally contain instructions which, when executed by the processor unit, can perform any number of manipulations on data that may be received from the 2020 manual unit and/or from computational devices not included in the imaging control unit.
[0282] The light sources may include a 2012 white light source and one or more laser light sources. The 2002 imaging control unit includes one or more optical and/or electrical interfaces for optical and/or electrical communication with the manual unit 2020.
[0283] In a non-limiting aspect, the 2020 hand unit can include a 2021 body, a 2015 camera scope cable attached to the 2021 body, and a 2024 elongated camera probe. The 2021 body of the 2020 hand unit can include manual unit control buttons 2022 or other controls to enable a healthcare professional using the 2020 hand unit to control the operations of the 2020 hand unit or of other components of the 2002 imaging control unit, including, for example, the light sources. The 2015 camera scope cable may include one or more electrical conductors and one or more optical fibers. The 2015 camera scope cable can terminate in a 2008 camera head connector at a proximal end, where the 2008 camera head connector is configured to mate with one or more optical and/or electrical interfaces of the 2002 imaging control unit. The electrical conductors can supply power to the 2020 hand unit, including the 2021 body and the 2024 elongated camera probe, and/or to any electrical components within the 2020 hand unit, including the 2021 body and/or the 2024 elongated camera probe. The electrical conductors can also serve to provide bidirectional data communication between any one or more components of the 2020 manual unit and the 2002 imaging control unit. The one or more optical fibers can conduct illumination from one or more light sources in the 2002 imaging control unit, through the 2021 hand unit body, to a distal end of the 2024 elongated camera probe. In some non-limiting aspects, the one or more optical fibers can also conduct light reflected or refracted from the surgical site to one or more optical sensors arranged in the 2024 elongated camera probe, in the 2021 manual unit body and/or in the 2002 imaging control unit.
[0284] Figure 22B (a top plan view) represents in more detail some aspects of a 2020 hand unit of the 2108 display system. The body of the 2021 hand unit can be constructed of a plastic material. The 2022 hand unit control buttons or other controls can be molded with a rubber overlay to protect the controls while allowing them to be manipulated by the surgeon. The 2015 camera scope cable may have optical fibers integrated with electrical conductors, and the 2015 camera scope cable may have a protective and flexible outer sheath, such as PVC. In some non-limiting examples, the 2015 camera scope cable can be about 10 feet long to allow ease of use during a surgical procedure. The length of the 2015 camera scope cable can range from about 5 feet to about 15 feet. Non-limiting examples of a 2015 camera scope cable length may include about 5 feet, about 6 feet, about 7 feet, about 8 feet, about 9 feet, about 10 feet, about 11 feet, about 12 feet, about 13 feet, about 14 feet, about 15 feet, or any length or range of lengths between them. The 2024 elongated camera probe can be manufactured from a rigid material such as stainless steel. In some aspects, the 2024 elongated camera probe can be joined to the 2021 hand unit body by a 2026 rotating collar. The 2026 rotating collar can enable the 2024 elongated camera probe to be rotated relative to the 2021 hand unit body. In some aspects, the 2024 elongated camera probe can terminate at a distal end in an epoxy-sealed 2028 plastic window.
[0285] The side plan view of the hand unit, shown in Figure 22C, illustrates that a 2030 light or image sensor can be arranged at a distal end 2032a of the elongated camera probe or within the body of the hand unit 2032b. In some alternative aspects, the 2030 light or image sensor can be arranged with additional optical elements in the 2002 imaging control unit. Figure 22C additionally shows an example of a 2030 light sensor comprising a 2034 CMOS image sensor arranged inside a 2036 bezel with a radius of about 4 mm. Figure 22D illustrates aspects of the 2034 CMOS image sensor, representing the active area 2038 of the image sensor. Although the CMOS image sensor in Figure 22C is shown disposed within a 2036 bezel that has a radius of about 4 mm, it can be recognized that such a combination of sensor and bezel can be of any size useful for being arranged within the 2024 elongated camera probe, the 2021 manual unit body or the 2002 image control unit. Some non-limiting examples of such alternative bezels may include a 5.5 mm bezel 2136a, a 4 mm bezel 2136b, a 2.7 mm bezel 2136c and a 2 mm bezel 2136d. It can be recognized that the image sensor may alternatively comprise a CCD image sensor. The CMOS or CCD sensor can comprise an array of individual light sensing elements (pixels).
[0286] Figures 23A to 23F represent various aspects of some examples of light sources and their control that can be incorporated into the 2108 visualization system.
[0287] [0287] Figure 23A illustrates an aspect of a laser illumination system having a plurality of laser sources that emit a plurality of wavelengths of electromagnetic energy. As can be seen in the figure, the illumination system 2700 can comprise a red laser source 2720, a green laser source 2730, and a blue laser source 2740 that are all optically coupled together via an optical fiber 2755. As can also be seen in the figure, each of the laser sources can have a corresponding light-sensing element or electromagnetic sensor 2725, 2735, 2745, respectively, to detect the output of the specific laser source or wavelength.
[0288] [0288] Additional disclosures regarding the laser illumination system shown in Figure 23A for use in a surgical visualization system 2108 can be found in US Patent Application Publication No. 2014/0268860, entitled CONTROLLING THE INTEGRAL LIGHT ENERGY OF A LASER PULSE, filed on March 15, 2014, which was granted on October 3, 2017 as US Patent No. 9,777,913, the content of which is incorporated by reference in its entirety and for all purposes.
[0289] [0289] Figure 23B illustrates the operating cycles of a sensor used in rolling readout mode. It will be understood that the x direction corresponds to time and that the diagonal lines 2202 indicate the activity of an internal pointer that reads out each frame of data, one row at a time. The same pointer is responsible for resetting each row of pixels for the next exposure period. The net integration time for each row 2219a to 2219c is equivalent, but the rows are staggered in time with respect to one another due to the reset and the rolling readout process. Therefore, for any scenario in which adjacent frames are required to represent different constitutions of light, the only option for keeping each row consistent is to pulse the light between readout cycles 2230a to 2230c. More specifically, the maximum available period corresponds to the sum of the blanking time plus any time during which the optical black or optically blind (OB) rows (2218, 2220) are serviced at the beginning or at the end of the frame.
[0290] [0290] Figure 23B illustrates the operating cycles of a sensor used in rolling readout mode, or during sensor readout
[0291] [0291] As shown in Figure 23B, these optical black rows 2218 and 2220 may be located at the top of the pixel array, at the bottom of the pixel array, or at both the top and the bottom of the pixel array.
[0292] [0292] It should be noted that the condition for a light pulse 2230a to 2230c to be read in only one frame, without interfering with neighboring frames, is to have the given light pulse 2230a to 2230c fire during the blanking time 2216. Because the optical black rows 2218 and 2220 are insensitive to light, the time of frame (m) for the rear optical black rows 2220 and the time of frame (m + 1) for the front optical black rows 2218 can be added to the blanking time 2216 to determine the maximum firing window of the light pulse 2230.
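The timing constraint described above can be illustrated with a short calculation: the maximum light-pulse firing window is the blanking time plus the readout time of the optical black rows bracketing the frame. This is a minimal sketch; the blanking time, OB row counts, and per-row readout time below are hypothetical values, not parameters of any particular sensor.

```python
def max_pulse_window_us(blanking_us, ob_rows_front, ob_rows_rear, row_time_us):
    """Maximum window for firing a light pulse without straddling
    active-row exposure: the blanking time plus the time spent
    servicing the light-insensitive optical black (OB) rows at the
    frame boundaries."""
    return blanking_us + (ob_rows_front + ob_rows_rear) * row_time_us

# Hypothetical timing: 500 us blanking, 8 front and 8 rear OB rows,
# 15 us readout per row.
window = max_pulse_window_us(500.0, 8, 8, 15.0)
print(window)  # 740.0
```

A longer window permits either a longer pulse (more light energy per frame) or looser synchronization between the light source and the sensor.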
[0293] [0293] In some aspects, Figure 23B shows an example of a timing diagram for sequential frame captures by a conventional CMOS sensor. Such a CMOS sensor can incorporate a Bayer pattern of color filters, as shown in Figure 23C. It is recognized that the Bayer pattern provides greater luminance detail than chrominance detail. It can be further recognized that the sensor has a reduced spatial resolution, since a total of 4 adjacent pixels is required to produce the color information for each aggregated spatial portion of the image. In an alternative approach, the color image can be constructed by strobing the visualized area at high speed with a variety of optical sources (laser diodes or light-emitting diodes) having different central optical wavelengths.
[0294] [0294] The strobing optical system may be under the control of the camera system, and may include a specially designed high-speed CMOS sensor. The main benefit is that the sensor can achieve the same spatial resolution with significantly fewer pixels compared to conventional Bayer or 3-sensor cameras. Therefore, the physical space occupied by the pixel array can be reduced. The actual pulse periods (2230a to 2230c) may differ within the repeating pattern, as shown in Figure 23B. This is useful, for example, to allot more time to components that require more light energy or to those having the weaker sources. As long as the average captured frame rate is an integer multiple of the required final system frame rate, the data can simply be buffered in the signal processing chain as appropriate.
[0295] [0295] The ability to reduce the area of the CMOS sensor's integrated circuit to the extent allowed by the combination of all of these methods is particularly attractive for small-diameter (~3 to 10 mm) endoscopy. In particular, it allows for endoscope designs in which the sensor is located in the space-constrained distal end, thereby greatly reducing the complexity and cost of the optical section, while providing high-definition video. The consequence of this approach is that reconstructing each final full-color image requires data to be fused from three snapshots in time. Any movement within the scene relative to the optical frame of reference of the endoscope will generally degrade the perceived resolution, since the edges of objects appear at slightly different locations within each captured component. This description describes a means to reduce this problem, which exploits the fact that spatial resolution is much more important for luminance information than for chrominance.
[0296] [0296] The basis of the approach is that, instead of firing monochromatic light during each frame, combinations of the three wavelengths are used to provide all luminance information within a single image. Chrominance information is derived from separate frames with, for example, a repeat pattern like Y-Cb-Y-Cr (Figure 23D). Although it is possible to provide pure luminance data by an astute choice of pulse ratios, the same is not true in the case of chrominance.
[0297] [0297] In one aspect, as illustrated in Figure 23D, an endoscopic system 2300a can comprise a pixel array 2302a with uniform pixels, and the 2300a system can be operated to receive Y pulses (luminance pulse) 2304a, Cb (ChromaBlue) 2306a and Cr (ChromaRed) 2308a.
[0298] [0298] To complete a full color image, the two chrominance components must also be provided. However, the same algorithm that was applied for luminance cannot be applied directly to chrominance, since chrominance is signed, as reflected in the fact that some of the RGB coefficients are negative. The solution is to add a degree of luminance of sufficient magnitude that all of the final pulse energies become positive. As long as the color fusion process in the ISP is aware of the composition of the chrominance frames, they can be decoded by subtracting the appropriate amount of luminance from a neighboring frame. The pulse energy proportions are given by:

Y = 0.183 · R + 0.614 · G + 0.062 · B
Cb = λ · Y − 0.101 · R − 0.339 · G + 0.439 · B
Cr = δ · Y + 0.439 · R − 0.399 · G − 0.040 · B

where

λ ≥ 0.339 / 0.614 = 0.552
δ ≥ 0.399 / 0.614 = 0.650
[0299] [0299] It turns out that, if the factor λ is equal to 0.552, the red and green components are exactly canceled, in which case the Cb information can be provided with pure blue light. Similarly, setting δ = 0.650 cancels the green and blue components for Cr, which becomes pure red. This specific example is illustrated in Figure 23E, which also depicts λ and δ as integer multiples of 1/2⁸ (that is, 1/256). This is a convenient approximation for digital frame reconstruction.
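The cancellation described above can be checked numerically from the pulse energy equations: adding λ·Y to the Cb coefficients (and δ·Y to the Cr coefficients) should drive the unwanted RGB components to approximately zero. This is a small verification sketch of those published coefficients; the variable names are illustrative only.

```python
# RGB weights of the luminance (Y) pulse and the signed chrominance
# coefficients, before the luminance offset is added.
Y = (0.183, 0.614, 0.062)
CB = (-0.101, -0.339, 0.439)
CR = (0.439, -0.399, -0.040)

lam = 0.339 / 0.614    # ≈ 0.552: cancels R and G in the Cb pulse
delta = 0.399 / 0.614  # ≈ 0.650: cancels G and B in the Cr pulse

cb_pulse = [lam * y + c for y, c in zip(Y, CB)]
cr_pulse = [delta * y + c for y, c in zip(Y, CR)]

print(cb_pulse)  # R and G components ≈ 0, blue ≈ 0.473 -> pure blue light
print(cr_pulse)  # G and B components ≈ 0, red ≈ 0.558 -> pure red light
```

With these minimum values of λ and δ, the chrominance frames can therefore be delivered with single-color illumination, which simplifies the pulsed light source.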
[0300] [0300] In the case of the Y-Cb-Y-Cr pulse scheme, the image data are already in the YCbCr space after the color fusion. So, in this case, it makes sense to perform the luminance- and chrominance-based operations up front, before converting back to linear RGB to perform color correction, etc.
[0301] [0301] The color fusion process is simpler than "demosaicing" (color interpolation), which is required by the Bayer pattern (see Figure 23C), since there is no spatial interpolation. It does, however, require the buffering of frames in order to have all of the necessary information available for each pixel. In general, data from the Y-Cb-Y-Cr pattern can be pipelined to produce one complete color image per two captured raw images. This is accomplished by using each chrominance sample twice. Figure 23F shows the specific example of a 120 Hz capture frame rate providing a final 60 Hz video.
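The fusion step described above can be sketched as a small buffering loop: each luminance frame is combined with the most recently captured Cb and Cr samples, so each chrominance sample serves two output frames and a 120 Hz raw stream yields 60 Hz color video. This is a simplified sketch of the pipelining idea only; the function name and the use of placeholder integers in place of pixel arrays are illustrative assumptions, not the patent's implementation.

```python
def fuse_ycbcr_stream(frames):
    """Fuse a Y-Cb-Y-Cr pulsed capture stream into full-color frames.

    `frames` is a sequence of (kind, data) tuples captured in the
    repeating order Y, Cb, Y, Cr.  Each luminance frame is paired with
    the latest buffered chrominance samples; because each Cb and Cr
    sample is reused for two Y frames, the output rate is one color
    frame per two raw frames (after pipeline fill)."""
    cb = cr = None
    out = []
    for kind, data in frames:
        if kind == "Cb":
            cb = data
        elif kind == "Cr":
            cr = data
        else:  # a luminance frame: emit once both chrominance samples exist
            if cb is not None and cr is not None:
                out.append((data, cb, cr))
    return out

# Integers stand in for captured frame buffers.
raw = [("Y", 1), ("Cb", 10), ("Y", 2), ("Cr", 20),
       ("Y", 3), ("Cb", 11), ("Y", 4), ("Cr", 21)]
print(fuse_ycbcr_stream(raw))  # [(3, 10, 20), (4, 11, 20)]
```

Note that the first luminance frames are dropped while the chrominance buffers fill; thereafter every Y frame produces a color frame.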
[0302] [0302] Additional disclosures regarding the control of the laser components of an illumination system, as shown in Figures 23B to 23F, for use in a surgical visualization system 2108 can be found in US Patent Application Publication No. 2014/0160318, entitled YCBCR PULSED ILLUMINATION SCHEME IN A LIGHT DEFICIENT ENVIRONMENT, filed on July 26, 2013, which was granted on December 6, 2016 as US Patent No. 9,516,239, and US Patent Application Publication No. 2014/0160319, titled CONTINUOUS
[0303] [0303] During a surgical procedure, a surgeon may have to manipulate tissues to achieve a desired medical result. The surgeon's actions are limited by what is visually observable at the surgical site. Thus, the surgeon may not be aware, for example, of the arrangement of vascular structures under the tissues being manipulated during the procedure.
[0304] [0304] Since the surgeon is unable to view the vasculature under a surgical site, the surgeon may accidentally cut one or more blood vessels during the procedure.
[0305] [0305] Therefore, it is desirable to have a surgical visualization system that can capture imaging data from the surgical site for presentation to a surgeon, where the presentation may include information related to the presence of vascular structures located under the surface of a surgical site.
[0306] [0306] Some aspects of the present description also provide a control circuit configured to control the illumination of a surgical site using one or more light sources, such as laser light sources, and to receive imaging data from one or more image sensors. In some aspects, the present description provides a non-transitory, computer-readable medium storing computer-readable instructions that, when executed, cause a device to detect a blood vessel in a tissue and determine its depth below the tissue surface.
[0307] [0307] In some aspects, a surgical imaging system may include a plurality of light sources, where each light source is configured to emit light having a specified central wavelength, a light sensor configured to receive a portion of the light reflected from a tissue sample when illuminated by one or more of the plurality of light sources, and a computer system. The computer system can be configured to: receive data from the light sensor when the tissue sample is illuminated by each of the plurality of light sources; determine a depth location of a structure within the tissue sample based on the data received by the light sensor when the tissue sample is illuminated by each of the plurality of light sources; and calculate visualization data for the structure and the depth of the structure. In some aspects, the visualization data may have a data format that can be used by a display system, and the structure may comprise one or more vascular tissues.

Acquisition of vascular images using NIR spectroscopy
[0308] [0308] In one aspect, a surgical imaging system can include an independent color cascade of illumination sources, comprising visible light and light outside the visible range, to image one or more tissues within a surgical site at different times and at different depths. The surgical image capture system can additionally detect or calculate characteristics of the light reflected and/or refracted from the surgical site. The characteristics of the light can be used to provide a composite image of the tissue within the surgical site, and also to provide an analysis of underlying tissue not directly visible at the surface of the surgical site. The surgical imaging system can determine tissue depth location without the need for separate measuring devices.
[0309] [0309] In one aspect, the characteristic of reflected and / or refracted light from the surgical site can be an amount of light absorbance at one or more wavelengths. Various chemical components of individual tissues can result in specific patterns of light absorption that are dependent on wavelength.
[0310] [0310] In one aspect, the light sources may comprise a red laser source and a near infrared laser source, wherein the one or more tissues to be imaged may include vascular tissue, such as veins or arteries. In some respects, red laser sources (in the visible range) can be used to capture images of some aspects of underlying vascular tissue based on spectroscopy in the visible red range. In some non-limiting examples, a red laser light source can generate illumination that has a peak wavelength that can vary between 635 nm and 660 nm, inclusive. Non-limiting examples of a red laser peak wavelength may include about 635 nm, about 640 nm, about 645 nm, about 650 nm, about 655 nm, about 660 nm or any value or range of values between them. In some other respects, near-infrared laser sources can be used to capture images of underlying vascular tissue based on near-infrared spectroscopy. In some non-limiting examples, a source of near-infrared laser can emit illumination with a wavelength that can vary between 750 and 3000 nm, inclusive. Non-limiting examples of a peak infrared laser wavelength may include about 750 nm, about 1,000 nm, about 1,250 nm, about 1,500 nm, about 1,750 nm, about 2,000 nm, about 2,250 nm, about 2,500 nm, about 2,750 nm, about 3,000 nm or any value or range of values between them.
[0311] [0311] Near infrared spectroscopy ("NIRS" - near infrared spectroscopy) is a non-invasive technique that allows the determination of tissue oxygenation based on the spectrophotometric quantification of oxyhemoglobin and deoxyhemoglobin inside a tissue. In some respects, NIRS can be used to capture images of vascular tissue directly based on the difference in the absorbance of illumination between the vascular tissue and the non-vascular tissue. Alternatively, vascular tissue can be indirectly visualized based on a difference in lighting absorbance of blood flow in the tissue before and after the application of physiological interventions, such as arterial and venous occlusion methods.
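The spectrophotometric quantification of oxyhemoglobin and deoxyhemoglobin mentioned above can be sketched as a two-wavelength Beer-Lambert calculation: absorbance measured at two wavelengths gives two linear equations in the two chromophore concentrations. This is a minimal sketch of the principle only; the extinction coefficients and absorbance values below are illustrative placeholders, not calibrated hemoglobin data.

```python
def nirs_concentrations(a1, a2, e_hbo2, e_hb, path_cm=1.0):
    """Solve the two-wavelength Beer-Lambert system
        a_i = path * (e_hbo2[i] * C_hbo2 + e_hb[i] * C_hb)
    for the oxy- (C_hbo2) and deoxyhemoglobin (C_hb) concentrations.

    a1, a2:        measured absorbances at wavelengths 1 and 2
    e_hbo2, e_hb:  (eps_at_wl1, eps_at_wl2) extinction coefficients
    """
    det = (e_hbo2[0] * e_hb[1] - e_hbo2[1] * e_hb[0]) * path_cm
    c_hbo2 = (a1 * e_hb[1] - a2 * e_hb[0]) / det
    c_hb = (a2 * e_hbo2[0] - a1 * e_hbo2[1]) / det
    return c_hbo2, c_hb

# Illustrative round-trip: absorbances generated from known
# concentrations (0.6, 0.3) are recovered by the solver.
c = nirs_concentrations(1.65, 2.34, e_hbo2=(1.0, 3.0), e_hb=(3.5, 1.8))
print(tuple(round(v, 3) for v in c))  # ≈ (0.6, 0.3)
```

Tissue oxygen saturation then follows as C_hbo2 / (C_hbo2 + C_hb); in practice scattering corrections (the modified Beer-Lambert law) are also applied.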
[0312] [0312] Instrumentation for near-IR (NIR) spectroscopy can be similar to instruments for the UV-visible and mid-IR ranges. Such spectroscopic instruments can include a light source, a detector, and a dispersive element to select a specific near-IR wavelength for illuminating the tissue sample. In some aspects, the source may comprise an incandescent light source or a quartz-halogen light source. In some aspects, the detector may comprise a semiconductor photodiode (for example, InGaAs) or a photodiode array. In some aspects, the dispersive element may comprise a prism or, more commonly, a diffraction grating. Fourier transform NIR instruments that use an interferometer are also common, especially for wavelengths greater than about 1,000 nm. Depending on the sample, the spectrum can be measured in either reflection or transmission mode.
[0313] [0313] Figure 24 schematically depicts an example of instrumentation 2400, similar to instruments for the UV-visible and mid-IR ranges, for NIR spectroscopy. A light source 2402 can emit a wide spectral range of illumination 2404 that can fall on a dispersive element 2406 (such as a prism or diffraction grating). The dispersive element 2406 can operate to select a narrow-wavelength portion 2408 of the light emitted by the broad-spectrum light source 2402, and the selected portion 2408 of the light can illuminate the tissue 2410. The light 2412 reflected from the tissue can be directed to a detector 2416 (for example, by means of a dichroic mirror 2414), and the intensity of the reflected light 2412 can be recorded. The wavelength of the light that illuminates the tissue 2410 can be selected by the dispersive element 2406. In some aspects, the tissue 2410 can be illuminated by only a single narrow-wavelength portion 2408 selected by the dispersive element 2406 from the light source 2402. In other aspects, the tissue 2410 can be scanned with a variety of narrow-wavelength portions 2408 selected by the dispersive element 2406. In this way, a spectroscopic analysis of the tissue 2410 can be obtained over a range of NIR wavelengths.
[0314] [0314] Figure 25 schematically depicts an example of instrumentation 2430 for NIRS determination based on Fourier transform infrared imaging. In Figure 25, a laser source 2432 that emits light 2434 in the near-IR range illuminates a tissue sample 2440. The light 2436 reflected by the tissue 2440 is reflected 2442 by a mirror, such as a dichroic mirror
[0315] [0315] An alternative to near-infrared light for determining hemoglobin oxygenation would be the use of monochromatic red light to determine the red light absorbance characteristics of hemoglobin. The absorbance by hemoglobin of red light having a central wavelength of about 660 nm can indicate whether the hemoglobin is oxygenated (arterial blood) or deoxygenated (venous blood).
[0316] [0316] In some alternative surgical procedures, contrast agents can be used to improve the data that are collected on oxygenation and oxygen consumption by the tissue. In one non-limiting example, NIRS techniques can be used in conjunction with an intravenous bolus injection of a near-IR contrast agent such as indocyanine green ("ICG"), which has a peak absorbance at about 800 nm. ICG has been used in some medical procedures to measure cerebral blood flow.

Acquisition of vascular images using laser Doppler flowmetry
[0317] [0317] In one aspect, the characteristic of the light reflected and/or refracted from the surgical site can be a Doppler shift in the wavelength of the light relative to its illumination source.
[0318] [0318] Laser Doppler flowmetry can be used to visualize and characterize a flow of particles that move in relation to an effectively stationary background. In this way, the laser light scattered by moving particles, such as blood cells, may have a different wavelength than the original laser light source. In contrast, the laser light scattered over the effectively stationary background (for example, vascular tissue) may have a wavelength equal to that of the original laser light source. The change in the wavelength of the light spread from the blood cells can reflect both the direction of blood cell flow relative to the laser source and the speed of the blood cell. Figures 26A to 26C illustrate the change in the wavelength of light scattered from blood cells that may be moving away from (Figure 26A) or towards (Figure 26C) the laser light source.
[0319] [0319] In each of Figures 26A to 26C, the original illumination light 2502 is represented with a relative central wavelength of 0. It can be seen from Figure 26A that light 2504 scattered from blood cells moving away from the laser source has a wavelength shifted by a certain amount 2506 to a wavelength greater than that of the laser source (and is therefore red shifted). It can also be seen from Figure 26C that light 2508 scattered from blood cells moving toward the laser source has a wavelength shifted by a certain amount 2510 to a wavelength shorter than that of the laser source (and is therefore blue shifted). The amount of the wavelength shift (for example, 2506 or 2510) may depend on the speed at which the blood cells move. In some aspects, the amount of red shift (2506) of some blood cells can be approximately the same as the amount of blue shift (2510) of some other blood cells. Alternatively, the amount of red shift (2506) of some blood cells may differ from the amount of blue shift (2510) of some other blood cells. Thus, the speed of blood cells flowing away from the laser source, as shown in Figure 26A, may be less than the speed of blood cells flowing toward the laser source, as shown in Figure 26C, based on the relative magnitudes of the wavelength shifts (2506 and 2510). In contrast, and as shown in Figure 26B, light scattered from tissue that does not move relative to the laser light source (for example, blood vessels 2512 or non-vascular tissue 2514) may not demonstrate any change in wavelength.
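The dependence of the shift on particle speed sketched in Figures 26A to 26C can be illustrated with the standard laser Doppler flowmetry relation for backscatter, Δf = 2·v·cos(θ)/λ. This is a simplified sketch under the usual single-scattering assumption; the velocity, wavelength, and angle values are illustrative only.

```python
import math

def doppler_shift_hz(velocity_mm_s, wavelength_nm, angle_deg=0.0):
    """Frequency shift of laser light backscattered from a particle.

    velocity_mm_s: particle speed along its direction of motion;
                   positive toward the source (blue shift), negative
                   away from it (red shift)
    angle_deg:     angle between the beam axis and the flow direction
    """
    v = velocity_mm_s * 1e-3   # mm/s -> m/s
    lam = wavelength_nm * 1e-9  # nm -> m
    return 2.0 * v * math.cos(math.radians(angle_deg)) / lam

# Blood cells at 10 mm/s moving toward a 532 nm (green) source,
# with the beam aligned to the flow axis:
print(doppler_shift_hz(10.0, 532.0))  # ≈ 3.76e4 Hz
```

The shift is tiny relative to the optical frequency (~5.6e14 Hz for green light), which is why it is measured interferometrically, by beating the scattered light against a reference beam, rather than by resolving the wavelength directly.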
[0320] [0320] Figure 27 depicts an aspect of instrumentation 2530 that can be used to detect a Doppler shift in laser light scattered from portions of a tissue 2540. Light 2534 from a laser 2532 can pass through a dichroic beam
[0321] [0321] It can be recognized that the backscattered light 2542 from the tissue 2540 can also include light backscattered from boundary layers within the tissue 2540 and/or light absorbed at specific wavelengths by material within the tissue 2540. As a result, the interference pattern observed at the detector 2550 may incorporate interference fringe features from these additional optical effects and may therefore confound the calculation of the Doppler shift if not properly analyzed.
[0322] [0322] Figure 28 shows some of these additional optical effects. It is well known that light traveling through a first optical medium having a first refractive index, n1, can be reflected at an interface with a second optical medium having a second refractive index, n2. The light transmitted into the second optical medium will have a transmission angle relative to the interface that differs from the angle of the incident light, based on the difference between the refractive indices n1 and n2 (Snell's law). Figure 28 illustrates the effect of Snell's law on light that strikes the surface of a multi-component tissue 2150, as may be seen in a surgical field. The multi-component tissue 2150 may consist of an outer tissue layer 2152 having a refractive index n1 and a buried structure, such as a blood vessel having a vessel wall 2156. The vessel wall 2156 can be characterized by a refractive index n2. Blood can flow in the lumen of the blood vessel 2160. In some aspects, it may be important during a surgical procedure to determine the position of the blood vessel 2160 below the surface 2154 of the outer tissue layer 2152 and to characterize the blood flow using Doppler techniques.
[0323] [0323] An incident laser light 2170a can be used to probe for the blood vessel 2160 and can be directed at the top surface 2154 of the outer tissue layer 2152. A portion 2172 of the incident laser light 2170a can be reflected at the top surface 2154. Another portion 2170b of the incident laser light 2170a can penetrate the outer tissue layer 2152. The portion 2172 reflected at the top surface 2154 of the outer tissue layer 2152 has the same path length as the incident light 2170a, and therefore has the same wavelength and phase as the incident light 2170a. However, the portion 2170b of light transmitted into the outer tissue layer 2152 will have a transmission angle that differs from the angle of incidence of the light falling on the tissue surface, due to the fact that the outer tissue layer 2152 has a refractive index n1 that differs from the refractive index of air.
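The refraction of the transmitted portion can be sketched numerically with Snell's law, n1·sin(θ1) = n2·sin(θ2). This is a minimal sketch; the refractive indices below (air ≈ 1.00, an outer tissue layer ≈ 1.40) are illustrative assumptions, since the optical properties of real tissue vary with composition and wavelength.

```python
import math

def transmission_angle_deg(incidence_deg, n1, n2):
    """Snell's law: n1*sin(theta1) = n2*sin(theta2).  Returns the
    refraction angle in medium 2 (degrees), or None beyond the
    critical angle (total internal reflection)."""
    s = n1 * math.sin(math.radians(incidence_deg)) / n2
    if abs(s) > 1.0:
        return None
    return math.degrees(math.asin(s))

# Light entering tissue from air bends toward the normal:
print(round(transmission_angle_deg(30.0, 1.00, 1.40), 2))  # 20.92
```

It is this bending at each interface that changes the optical path length of the transmitted portion, producing the phase shifts that must be separated from the Doppler wavelength shift in the analysis that follows.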
[0324] [0324] If the portion of light transmitted through the outer tissue layer 2152 falls on a second tissue surface 2158, for example that of the blood vessel wall 2156, a certain portion 2174a,b of the light will be reflected back toward the source of the incident light 2170a. The light 2174a thus reflected at the interface between the outer tissue layer 2152 and the blood vessel wall 2156 will have the same wavelength as the incident light 2170a, but will undergo a phase shift due to the change in the length of the light path. Projecting the light 2174a,b reflected from the interface between the outer tissue layer 2152 and the blood vessel wall 2156, together with the incident light, onto the sensor will produce an interference pattern based on the phase difference between the two light sources.
[0325] [0325] Additionally, a portion 2170c of the incident light can be transmitted through the blood vessel wall 2156 and penetrate the lumen of the blood vessel 2160. That portion 2170c of the incident light can interact with blood cells moving within the lumen of the blood vessel 2160 and can be reflected back 2176a to 2176c toward the incident light source with a wavelength that is Doppler shifted according to the speed of the blood cells, as disclosed above. The Doppler-shifted light 2176a to 2176c reflected from the moving blood cells can be projected, together with the incident light, onto the sensor, resulting in an interference pattern having a fringe pattern based on the difference in wavelength between the two light sources.
[0326] [0326] In Figure 28, a light path 2178 is shown for light falling on the erythrocytes in the lumen of the blood vessel 2160 as it would appear if there were no change in refractive index between the emitted light and the light reflected by the moving blood cells. In that case, only a Doppler shift in the wavelength of the reflected light would be detected. In practice, however, the light reflected by the blood cells (2176a to 2176c) can incorporate phase changes due to the variation in the refractive indices of the tissue, in addition to the changes in wavelength due to the Doppler effect.
[0327] [0327] Thus, it can be understood that, if the light sensor receives the incident light, the light reflected from one or more tissue interfaces (2172 and 2174a,b), and the Doppler-shifted light from the blood cells (2176a to 2176c), the interference pattern thus produced at the light sensor can include effects due to the Doppler shift (change in wavelength) as well as effects due to changes in the refractive index within the tissue (phase variation). As a result, a Doppler analysis of the light reflected by the tissue sample can produce erroneous results if the effects due to changes in the refractive index within the sample are not compensated for.
[0328] [0328] Figure 29 illustrates an example of the effects on a Doppler analysis of light falling on a tissue sample 2250 to determine the depth and location of an underlying blood vessel. If there is no intervening tissue between the blood vessel and the tissue surface, the interference pattern detected in the sensor may be due mainly to the change in the wavelength reflected from the moving blood cells. As a result, a 2252 spectrum derived from the interference pattern can generally reflect only the Doppler effect of blood cells. However, if there is intervening tissue between the blood vessel and the tissue surface, the interference pattern detected in the sensor may be due to a combination of the change in the wavelength reflected from the moving blood cells and the phase shift due to refractive index of the intervening tissue. A spectrum 2254 derived from such an interference pattern can result in the calculation of the Doppler effect that is confused due to the additional phase shift in the reflected light. In some respects, if information regarding the characteristics (thickness and refractive index) of the intervening tissue is known, the resulting spectrum 2256 can be corrected to provide a more accurate calculation of the change in wavelength.
[0329] [0329] It is recognized that the depth of light penetration into tissue depends on the wavelength of the light used. In this way, the wavelength of the laser source light can be chosen to detect particle movement (such as that of blood cells) within a specific range of tissue depths. Figures 30A to 30C schematically depict a means for detecting moving particles, such as blood cells, at a variety of tissue depths based on the wavelength of the laser light. As illustrated in Figure 30A, a laser source 2340 can direct an incident beam of laser light 2342 at a surface 2344 of a surgical site. A blood vessel 2346 (such as a vein or artery) can be disposed within the tissue 2348 at a certain depth δ from the tissue surface. The depth of penetration 2350 of a laser into a tissue 2348 may depend at least in part on the wavelength of the laser. Thus, laser light having a wavelength in the red range of about 635 nm to about 660 nm can penetrate the tissue to a depth 2351a of about 1 mm. Laser light having a wavelength in the green range of about 520 nm to about 532 nm can penetrate the tissue to a depth 2351b of about 2 to 3 mm. Laser light having a wavelength in the blue range of about 405 nm to about 445 nm can penetrate the tissue to a depth 2351c of about 4 mm or more. In the example shown in Figures 30A to 30C, a blood vessel 2346 may be located at a depth δ of about 2 to 3 mm below the tissue surface. The red laser light will not penetrate to this depth and, therefore, will not detect blood cells flowing within this vessel. However, both the green and the blue laser light can penetrate to that depth. Therefore, green and blue laser light scattered from blood cells within the blood vessel 2346 can demonstrate a Doppler shift in wavelength.
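The depth selectivity described above can be sketched as a simple lookup: given the nominal penetration depths stated in the passage, the system can report which laser colors are capable of reaching a structure at a given depth. The depth figures are taken from the passage itself; the table and function names are illustrative assumptions, and real penetration depths depend on tissue type.

```python
# Nominal penetration depths from the passage: red (~635-660 nm) ~1 mm,
# green (~520-532 nm) ~2-3 mm, blue (~405-445 nm) ~4 mm or more.
LASER_DEPTH_MM = {"red": 1.0, "green": 3.0, "blue": 4.0}

def lasers_reaching(depth_mm):
    """Return the laser colors whose nominal penetration depth covers
    a structure lying `depth_mm` below the tissue surface."""
    return [color for color, depth in LASER_DEPTH_MM.items()
            if depth >= depth_mm]

# A vessel 2.5 mm deep is detectable by green and blue light only,
# so a Doppler shift seen in green/blue but absent in red brackets
# the vessel depth between ~1 mm and ~3 mm.
print(lasers_reaching(2.5))  # ['green', 'blue']
```

Comparing which wavelengths do and do not return a Doppler shift thus localizes the vessel in depth without a separate ranging measurement.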
[0330] [0330] Figure 30B illustrates how a Doppler shift 2355 in the wavelength of reflected laser light can appear. The emitted light (or the light from the laser source 2342) that strikes a tissue surface 2344 may have a central wavelength 2352. For example, light from a green laser may have a central wavelength 2352 within a range from about 520 nm to about 532 nm. The reflected green light can have a central wavelength 2354 shifted to a longer wavelength (red shifted) if the light has been reflected from a particle, such as an erythrocyte, moving away from the detector. The difference between the central wavelength 2352 of the emitted laser light and the central wavelength 2354 of the reflected laser light comprises the Doppler shift 2355.
[0331] [0331] As disclosed above with respect to Figures 28 and 29, laser light reflected from structures within a tissue 2348 may also show a phase shift in the reflected light due to changes in the refractive index resulting from changes in tissue structure or composition. The emitted light (or the light from the laser source 2342) that strikes a tissue surface 2344 may have a first phase characteristic 2356. The reflected laser light may have a second phase characteristic 2358. It can be recognized that blue laser light, which can penetrate tissue to a depth 2351c of about 4 mm or more, can encounter a greater variety of tissue structures than red laser light (about 1 mm, 2351a)
[0332] [0332] Figure 30D illustrates aspects of tissue illumination by red 2360a, green 2360b, and blue 2360c laser light in a sequential manner. In some aspects, a tissue can be scanned by the red 2360a, green 2360b, and blue 2360c laser illumination sequentially. In some alternative examples, one or more combinations of red 2360a, green 2360b, and blue 2360c laser light, as shown in Figures 23D to 23F and disclosed above, can be used to illuminate the tissue according to a defined illumination sequence. Figure 30D illustrates the effect of such illumination on a CMOS imaging sensor 2362a to 2362d over time. Thus, at a first time t1, the CMOS sensor 2362a can be illuminated by the red laser 2360a. At a second time t2, the CMOS sensor 2362b can be illuminated by the green laser 2360b. At a third time t3, the CMOS sensor 2362c can be illuminated by the blue laser 2360c. The illumination cycle can then repeat, starting at a fourth time t4 at which the CMOS sensor 2362d can be illuminated by the red laser 2360a again. It can be recognized that sequential illumination of the tissue by laser illumination at different wavelengths can permit Doppler analysis at different tissue depths over time. Although red 2360a, green 2360b, and blue 2360c laser sources can be used to illuminate the surgical site, it can be recognized that other wavelengths outside of visible light (such as in the infrared or ultraviolet regions) can be used to illuminate the surgical site for Doppler analysis.
[0333] [0333] Figure 31 illustrates an example of the use of Doppler imaging to detect the presence of blood vessels that are not otherwise visible at a surgical site 2600. In Figure 31, a surgeon may wish to remove a tumor 2602 found in the right upper posterior lobe 2604 of a lung. Because the lungs are highly vascular, care must be taken to identify only the blood vessels associated with the tumor and to seal only those vessels without compromising blood flow to the unaffected portions of the lung. In Figure 31, the surgeon has identified the margin 2606 of the tumor 2602. The surgeon can then cut an initial dissected area 2608 at the edge of the region 2606, and the exposed blood vessels 2610 can be seen for cutting and sealing. A Doppler imaging detector 2620 can be used to locate and identify blood vessels 2612 in the dissected area that are not otherwise observable. An imaging system can receive data from the Doppler imaging detector 2620 for analysis and visualization of data obtained from the surgical site 2600. In some aspects, the imaging system may include a display to depict the surgical site 2600 that includes a visible image of the surgical site 2600 along with an overlay image of the blood vessels 2612 hidden in the image of the surgical site 2600.
[0334] [0334] In the scenario presented above with respect to Figure 31, a surgeon wishes to cut the blood vessels that supply oxygen and nutrients to a tumor while sparing the blood vessels associated with non-cancerous tissue. In addition, the blood vessels may lie at different depths at or around the surgical site 2600. The surgeon must therefore identify the position (depth) of the blood vessels and determine whether they are suitable for resection. Figure 32 illustrates a method for identifying deep blood vessels based on the Doppler shift of light reflected from blood cells flowing through them.
[0335] [0335] Figure 32 depicts the Doppler shift of laser light reflected from a blood vessel at a specific depth below a surgical site. The site can be illuminated by red laser light, green laser light and blue laser light. The central wavelength 2630 of the illumination light can be normalized to a relative center 2631. If the blood vessel resides at a depth of 4 mm or more below the surface of the surgical site, neither the red laser light nor the green laser light will be reflected by the blood vessel. Consequently, the central wavelength 2632 of the reflected red light and the central wavelength 2634 of the reflected green light will not differ much from the central wavelength 2630 of the red or green illumination light, respectively. However, if the site is illuminated by blue laser light, the central wavelength 2638 of the reflected blue light 2636 will differ from the central wavelength 2630 of the blue illumination light. In some cases, the amplitude of the reflected blue light 2636 may also be significantly reduced relative to the amplitude of the blue illumination light. A surgeon can thereby determine the presence of a deep blood vessel along with its approximate depth, and thus avoid the deep blood vessel during dissection of the surface tissue.
[0336] [0336] Figures 33 and 34 schematically illustrate the use of laser sources with different central wavelengths (colors) to determine the approximate depth of a blood vessel beneath the surface of a surgical site. Figure 33 depicts a first surgical site 2650 that has a surface 2654 and a blood vessel 2656 disposed below the surface 2654. In one method, the blood vessel 2656 can be identified based on a Doppler shift of the light that strikes the flow 2658 of blood cells within the blood vessel 2656. The surgical site 2650 can be illuminated by the light of several lasers 2670, 2676, 2682, each laser being characterized by emitting light at one of several different central wavelengths. As noted above, illumination by a red laser 2670 can penetrate tissue only to about 1 mm. Thus, if the blood vessel 2656 were located at a depth of less than 1 mm 2672 below the surface 2654, the red laser illumination would be reflected 2674 and a Doppler shift of the reflected red illumination 2674 could be determined. In addition, as noted above, illumination by a green laser 2676 can penetrate tissue only to approximately 2 to 3 mm. If the blood vessel 2656 were located at a depth of about 2 to 3 mm 2678 below the surface 2654, the green laser illumination would be reflected 2680 whereas the red laser illumination 2670 would not be, and a Doppler shift of the reflected green illumination 2680 could be determined. However, as shown in Figure 33, the blood vessel 2656 is situated at a depth of about 4 mm 2684 below the surface 2654. Therefore, neither the red laser illumination 2670 nor the green laser illumination 2676 would be reflected. Instead, only the blue laser illumination would be reflected 2686, and a Doppler shift of the reflected blue light 2686 can be determined.
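The depth-ranging logic of Figures 33 and 34 can be made concrete with a short sketch: the shallowest-penetrating laser whose reflection shows a Doppler shift bounds the vessel depth from above, and the deepest wavelength showing no shift bounds it from below. The function name and dictionary layout are illustrative assumptions; the penetration depths are the approximate values disclosed above (red ~1 mm, green ~2 to 3 mm, blue ~4 mm).

```python
# Approximate penetration depths in mm, per the disclosure (assumed values).
PENETRATION_MM = {"red": 1.0, "green": 3.0, "blue": 4.0}

def estimate_vessel_depth(doppler_detected):
    """doppler_detected: dict color -> True if a Doppler shift was observed.

    Returns (lower_bound_mm, upper_bound_mm) for the vessel depth, or None
    if no wavelength detected flow (vessel deeper than all lasers reach)."""
    lower = 0.0
    for color in ("red", "green", "blue"):  # shallowest-penetrating first
        if doppler_detected.get(color):
            return (lower, PENETRATION_MM[color])
        lower = PENETRATION_MM[color]
    return None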
[0337] [0337] In contrast to the blood vessel 2656 depicted in Figure 33, the blood vessel 2656' shown in Figure 34 is located closer to the tissue surface at the surgical site. The blood vessel 2656' can also be distinguished from the blood vessel 2656 in that the blood vessel 2656' is illustrated as having a much thicker wall 2657. Thus, the blood vessel 2656' may be an example of an artery, while the blood vessel 2656 may be an example of a vein, because arterial walls are known to be thicker than venous walls. In some instances, arterial walls may be about 1.3 mm thick. As disclosed above, the red laser illumination 2670' can penetrate tissue to a depth of about 1 mm 2672'. Thus, even if the blood vessel 2656' is exposed at a surgical site (see 2610 in Figure 31), the red laser light that is reflected 2674' from the surface of the blood vessel 2656' may not be able to reveal the blood flow 2658' within the blood vessel 2656' under Doppler analysis, due to the thickness 2657 of the blood vessel wall. However, as disclosed above, the green laser light 2676' that falls on the surface of a tissue can penetrate to a depth of about 2 to 3 mm 2678'. In addition, the blue laser light 2682' that falls on the surface of a tissue can penetrate to a depth of about 4 mm 2684'. Consequently, the green laser light can be reflected 2680' from the blood cells flowing 2658' within the blood vessel 2656', and the blue laser light can be reflected 2686' from the blood cells flowing 2658' within the blood vessel 2656'. As a result, a Doppler analysis of the reflected green light 2680' and the reflected blue light 2686' can provide information about blood flow in vessels near the surface, including the approximate depth of the blood vessel.
[0338] [0338] As shown above, the depth of blood vessels below the surgical site can be probed based on wavelength-dependent Doppler imaging. The amount of blood flowing through such a blood vessel can also be determined by speckle contrast (interference) analysis. The Doppler effect can indicate a particle moving relative to a stationary light source. As disclosed above, the Doppler wavelength shift can be an indication of the speed of the particle's movement. Individual particles such as blood cells may not be separately observable. However, the speed of each blood cell will produce a proportional Doppler shift. An interference pattern can be generated by combining the backscattered light from multiple blood cells, owing to differences in the Doppler shift of the backscattered light from each of the blood cells. The interference pattern can be an indication of the numerical density of blood cells within a display frame. This interference pattern can be called speckle contrast. Speckle contrast analysis can be carried out using a 300 x 300 full-frame CMOS imaging array, and the speckle contrast can be directly related to the number of moving particles (for example, blood cells) interacting with the laser light over a given exposure period.
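A conventional way to quantify the speckle contrast described above is the ratio K = σ/⟨I⟩ of the standard deviation to the mean of pixel intensities within a local window: more moving scatterers during an exposure blur the speckle and lower K. The sketch below is a minimal textbook illustration of that metric, not the patented implementation; the window size is arbitrary.

```python
# Minimal sketch of local laser speckle contrast: K = std / mean over a
# sliding window of the intensity frame. Lower K over an exposure implies
# more moving scatterers (e.g. blood cells) in that region.
import numpy as np

def speckle_contrast(frame, window=7):
    """Return a local speckle-contrast map for a 2-D intensity frame."""
    frame = frame.astype(float)
    h, w = frame.shape
    out = np.zeros((h - window + 1, w - window + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = frame[i:i + window, j:j + window]
            m = patch.mean()
            out[i, j] = patch.std() / m if m > 0 else 0.0
    return out
```

A perfectly uniform frame yields zero contrast everywhere, while a static speckle pattern yields a high contrast; in practice the map would be computed over each exposure of the CMOS array.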
[0339] [0339] A CMOS image sensor can be coupled to a digital signal processor (DSP). Each pixel of the sensor can be multiplexed and digitized. The Doppler effect on the light can be analyzed by comparing the light from the laser source with the light that has undergone the Doppler shift. A larger Doppler shift and greater speckle contrast may be related to a greater number of blood cells and a higher blood cell velocity in the blood vessel.
[0340] [0340] Figure 35 depicts an aspect of a composite visual display 2800 that can be presented to a surgeon during a surgical procedure. The composite visual display 2800 can be constructed by overlaying a white light image 2830 of the surgical site with a Doppler analysis image 2850.
[0341] [0341] In some respects, the white light image 2830 may depict surgical site 2832, one or more surgical incisions 2834 and tissue 2836 readily visible within surgical incision 2834. The white light image 2830 can be generated by illumination 2840 of surgical site 2832 with a white light source 2838 and reception of reflected white light 2842 by an optical detector. Although a 2838 white light source can be used to illuminate the surgical site surface, in one aspect, the surgical site surface can be visualized using suitable combinations of red 2854, green 2856 and blue 2858 laser light, as revealed above with respect to Figures 23C to 23F.
[0342] [0342] In some aspects, the Doppler analysis image 2850 may include information about blood vessel depth together with information about the blood flow 2852 (from the speckle analysis). As disclosed above, blood vessel depth and blood flow velocity can be obtained by illuminating the surgical site with laser light at multiple wavelengths and determining the blood vessel depth and blood flow based on the known penetration depth of light of a specific wavelength. In general, the surgical site 2832 can be illuminated by the light emitted by one or more lasers, such as a red laser 2854, a green laser 2856 and a blue laser 2858. A CMOS detector 2872 can receive the light reflected back (2862, 2866, 2870) from the surgical site 2832 and its surrounding tissue. The Doppler analysis image 2850 can be constructed 2874 based on an analysis of the multi-pixel data of the CMOS detector 2872.
[0343] [0343] In one aspect, a red laser 2854 can emit red laser illumination 2860 over surgical site 2832 and reflected light 2862 can reveal minimal surface or subsurface structures. In one aspect, a green laser 2856 can emit green laser illumination 2864 over surgical site 2832 and reflected light 2866 can reveal characteristics of a deeper subsurface. In one aspect, a blue laser 2858 can emit blue laser lighting 2868 over surgical site 2832 and reflected light 2870 can reveal, for example, a blood flow within the deepest vascular structures. In addition, speckle contrast analysis can provide the surgeon with information regarding the amount and speed of blood flow through the deepest vascular structures.
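The superposition that produces the composite display 2800 can be sketched as a simple alpha blend of the Doppler overlay onto the white-light image wherever subsurface vasculature was detected. This is a hedged illustration only: the function name, the 8-bit RGB convention and the blending weight are assumptions, not details from the disclosure.

```python
# Illustrative sketch of composite display construction: blend a Doppler
# overlay (e.g. a vessel map) onto the white-light image where a boolean
# mask marks detected vasculature; elsewhere the white-light image is kept.
import numpy as np

def composite_display(white_light_rgb, doppler_rgb, doppler_mask, alpha=0.6):
    """Blend doppler_rgb over white_light_rgb where doppler_mask is True."""
    out = white_light_rgb.astype(float)
    blend = alpha * doppler_rgb.astype(float) + (1.0 - alpha) * out
    out[doppler_mask] = blend[doppler_mask]
    return out.astype(np.uint8)
```

With alpha = 0.6, a pure-red Doppler pixel over a black white-light pixel blends to intensity 153 (0.6 × 255), leaving unmasked pixels untouched.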
[0344] [0344] Although not shown in Figure 35, it can be understood that the imaging system can also illuminate the surgical site with light outside the visible range. Such light can include infrared light and ultraviolet light. In some aspects, the infrared or ultraviolet light sources may include broadband wavelength sources (such as a tungsten source, a tungsten-halogen source or a deuterium source). In some other aspects, the infrared or ultraviolet light sources may include narrow-band wavelength sources (IR laser diodes, UV gas lasers or dye lasers).
[0345] [0345] Figure 36 is a flow chart 2900 of a method for determining the depth of a feature in a piece of tissue. An image capture system can illuminate 2910 a tissue with a first beam of light having a first central frequency and receive 2912 a first reflected light from the tissue illuminated by the first beam of light. The image capture system can then calculate 2914 a first Doppler shift based on the first beam of light and the first reflected light. The image capture system can then illuminate 2916 the tissue with a second beam of light having a second central frequency and receive 2918 a second reflected light from the tissue illuminated by the second beam of light. The image capture system can then calculate 2920 a second Doppler shift based on the second beam of light and the second reflected light. The image capture system can then calculate 2922 a depth of a tissue feature based in part on the first central wavelength, the first Doppler shift, the second central wavelength and the second Doppler shift. In some aspects, the tissue features can include the presence of moving particles, such as blood cells moving within a blood vessel, and a direction and speed of the flow of the moving particles. It can be understood that the method can be extended to include illuminating the tissue with any one or more additional light beams. In addition, the system can calculate an image comprising a combination of an image of the tissue surface and an image of a structure disposed within the tissue.
[0346] [0346] In some aspects, multiple visual displays can be used. For example, a 3D display can provide a composite image that combines the white light view (or a suitable combination of red, green and blue laser light) with laser Doppler imaging. Additional displays may provide only the white light view, or a composite of the white light view and an NIRS display for viewing only the tissue's blood oxygen response. However, the NIRS display may not need to be updated on every cycle, which allows time for a tissue response.
Characterization of subsurface tissue using multispectral OCT
[0347] [0347] During a surgical procedure, the surgeon can employ "smart" surgical devices for manipulating tissue. Such devices can be considered "intelligent" in the sense that they include features to direct, control and/or vary the actions of the devices based on parameters relevant to their use. The parameters can include the type and/or composition of the tissue being manipulated. If the type and/or composition of the tissue being manipulated is unknown, the actions of the smart devices may be inappropriate for that tissue. As a result, the tissue may be damaged or the tissue manipulation may be ineffective due to improper settings of the smart device.
[0348] [0348] The surgeon may attempt to manually vary the parameters of the smart device in a trial and error manner, resulting in an inefficient and time-consuming surgical procedure.
[0349] [0349] Therefore, it is desirable to have a surgical visualization system that can probe the tissue structures underlying a surgical site to determine their structural and compositional characteristics and provide such data to the smart surgical instruments used in a surgical procedure.
[0350] [0350] Some aspects of the present description also provide a control circuit configured to control the illumination of a surgical site using one or more light sources, such as laser light sources, and to receive imaging data from one or more image sensors. In some aspects, the present description provides a non-transitory, computer-readable medium that stores computer-readable instructions that, when executed, cause a device to characterize subsurface structures at a surgical site and determine the depth of the structures below the tissue surface.
[0351] [0351] In some respects, a surgical imaging system can comprise a plurality of lighting sources,
[0352] [0352] In one aspect, a surgical system can include multiple laser light sources and can receive laser light reflected from a tissue. The light reflected from the tissue can be used by the system to calculate the surface characteristics of the components disposed within the tissue. The characteristics of the components disposed within the tissue can include a composition of the components and/or a metric related to surface irregularities of the components.
[0353] [0353] In one aspect, the surgical system can transmit data related to the composition of the components and / or metrics related to surface irregularities of the components to a second instrument to be used in the tissue to modify the control parameters of the second instrument.
[0354] [0354] In some respects, the second device may be an advanced energy device, and modifications to the control parameters may include a clamp pressure, an operating power level, an operating frequency and a transducer signal amplitude.
[0355] [0355] As revealed above, blood vessels can be detected under the surface of a surgical site based on the Doppler effect in the light reflected by the blood cells that move within the blood vessels.
[0356] [0356] Laser Doppler flowmetry can be used to visualize and characterize a flow of particles moving relative to an effectively stationary background. In this way, laser light scattered by moving particles, such as blood cells, may have a wavelength different from that of the original illuminating laser source. In contrast, the laser light scattered by the effectively stationary background (for example, vascular tissue) may have a wavelength equal to that of the original laser light source. The change in the wavelength of the light scattered from the blood cells can reflect both the direction of blood cell flow relative to the laser source and the speed of the blood cells. As previously disclosed, Figures 26A to 26C illustrate the change in wavelength of light scattered from blood cells that may be moving away from (Figure 26A) or toward (Figure 26C) the laser light source.
[0357] [0357] In each of Figures 26A to 26C, the original illumination light 2502 is depicted with a relative central wavelength of 0. It can be seen from Figure 26A that light scattered from blood cells moving away from the laser source 2504 has a wavelength shifted by a certain amount 2506 toward a wavelength longer than that of the laser source (and is therefore red-shifted). It can also be seen from Figure 26C that light scattered from blood cells moving toward the laser source 2508 has a wavelength shifted by a certain amount 2510 toward a shorter wavelength relative to the laser source (and is therefore blue-shifted). The amount of wavelength shift (for example, 2506 or 2510) may depend on the speed at which the blood cells move. In some aspects, the amount of red shift 2506 of some blood cells may be approximately the same as the amount of blue shift 2510 of some other blood cells. Alternatively, the amount of red shift 2506 of some blood cells may differ from the amount of blue shift 2510 of some other blood cells. Thus, the speed of blood cells flowing away from the laser source, as shown in Figure 26A, may be less than the speed of blood cells flowing toward the laser source, as shown in Figure 26C, based on the relative magnitudes of the wavelength shifts 2506 and 2510. In contrast, and as depicted in Figure 26B, light scattered from tissue that does not move relative to the laser light source (for example, blood vessels 2512 or non-vascular tissue 2514) may not show any change in wavelength.
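The relation between the wavelength shift and the scatterer speed can be made explicit with the standard backscatter Doppler formula: for light reflected from a particle moving along the beam, Δλ/λ ≈ 2v/c, so v ≈ c·Δλ/(2λ). This worked sketch is textbook physics offered for illustration, not a formula quoted from the disclosure; the sign convention follows the figures (positive Δλ, a red shift, means motion away from the source).

```python
# Worked example: line-of-sight velocity from a Doppler wavelength shift
# for backscattered light (round trip gives the factor of 2).
C = 299_792_458.0  # speed of light, m/s

def velocity_from_shift(wavelength_nm, delta_lambda_nm):
    """Approximate line-of-sight velocity in m/s; positive = away from source."""
    return C * delta_lambda_nm / (2.0 * wavelength_nm)
```

For red illumination at 660 nm, a blood cell moving at roughly 1 mm/s produces a shift of only about 4.4e-9 nm, which is why the shift is measured interferometrically rather than spectrally.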
[0358] [0358] As previously disclosed, Figure 27 depicts an aspect of instrumentation 2530 that can be used to detect a Doppler effect in laser light scattered from portions of a tissue 2540. Light 2534 from a laser 2532 can pass through a dichroic beamsplitter 2544. A portion of the laser light 2536 can be transmitted through the dichroic beamsplitter 2544 and can illuminate the tissue
[0359] [0359] It can be recognized that the backscattered light 2542 from the tissue 2540 may also include light backscattered from boundary layers within the tissue 2540 and/or absorption of light of specific wavelengths by material within the tissue 2540. As a result, the interference pattern observed at the detector 2550 can incorporate interference fringe features from these additional optical effects and can therefore confound the calculation of the Doppler effect if not properly analyzed.
[0360] [0360] It can be recognized that the light reflected from the tissue may also include light backscattered from boundary layers within the tissue and/or absorption of light of specific wavelengths by material within the tissue. As a result, the interference pattern observed at the detector can incorporate fringe features that can confound the calculation of the Doppler effect if not properly analyzed.
[0361] [0361] As previously disclosed, Figure 28 shows some of these additional optical effects. It is well known that light traveling through a first optical medium having a first refractive index, n1, can be reflected at an interface with a second optical medium having a second refractive index, n2. The light transmitted into the second optical medium will have a transmission angle relative to the interface that differs from the angle of the incident light, based on the difference between the refractive indices n1 and n2 (Snell's law). Figure 28 illustrates the effect of Snell's law on light that strikes the surface of a multi-component tissue 2150, as may be encountered in a surgical field. The multi-component tissue 2150 may consist of an outer tissue layer 2152 having a refractive index n1 and a buried structure, such as a blood vessel having a vessel wall 2156. The vessel wall 2156 can be characterized by a refractive index n2. Blood can flow through the lumen of the blood vessel
[0362] [0362] Incident laser light 2170a can be used to probe the blood vessel 2160 and can be directed at the top surface 2154 of the outer tissue layer 2152. A portion 2172 of the incident laser light 2170a can be reflected at the top surface 2154. Another portion 2170b of the incident laser light 2170a can penetrate the outer tissue layer 2152. The portion 2172 reflected at the top surface 2154 of the outer tissue layer 2152 has the same path length as the incident light 2170a and therefore has the same wavelength and phase as the incident light 2170a. However, the portion 2170b of light transmitted into the outer tissue layer 2152 will have a transmission angle that differs from the angle of incidence of the light falling on the tissue surface, because the outer tissue layer 2152 has a refractive index n1 that differs from the refractive index of air.
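The transmission-angle change described above follows the standard form of Snell's law, n1·sin(θ1) = n2·sin(θ2). The short worked example below is textbook optics added for illustration; the index values are illustrative, not tissue measurements from the disclosure.

```python
# Worked example of Snell's law: angle of the transmitted ray given the
# incidence angle (measured from the surface normal) and the two indices.
import math

def refraction_angle(theta_incident_deg, n1, n2):
    """Transmission angle in degrees; raises ValueError past the critical angle."""
    s = n1 * math.sin(math.radians(theta_incident_deg)) / n2
    if abs(s) > 1.0:
        raise ValueError("total internal reflection")
    return math.degrees(math.asin(s))
```

For example, light entering from air (n1 = 1.0) at 30 degrees into a medium with an assumed index of 1.4 bends toward the normal, to roughly 21 degrees.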
[0363] [0363] If the portion of light transmitted through the outer tissue layer 2152 falls on a second tissue surface 2158, for example of the blood vessel wall 2156, a portion 2174a,b of the light will be reflected back toward the source of the incident light 2170a. The light 2174a thus reflected at the interface between the outer tissue layer 2152 and the blood vessel wall 2156 will have the same wavelength as the incident light 2170a, but will undergo a phase shift due to the change in the length of the light path. Projecting the light 2174a,b reflected from the interface between the outer tissue layer 2152 and the blood vessel wall 2156 together with the incident light onto the sensor will produce an interference pattern based on the phase difference between the two light sources.
[0364] [0364] Additionally, a portion of the incident light 2170c can be transmitted through the blood vessel wall 2156 and penetrate the lumen of the blood vessel 2160. That portion of the incident light 2170c can interact with the blood cells moving in the lumen of the blood vessel 2160 and can be reflected back 2176a to 2176c toward the incident light source with a wavelength that has undergone a Doppler shift according to the speed of the blood cells, as shown above. The Doppler-shifted light 2176a to 2176c reflected from the moving blood cells can be projected together with the incident light onto the sensor, resulting in an interference pattern having a fringe pattern based on the wavelength difference between the two light sources.
[0365] [0365] In Figure 28, a light path 2178 is shown for light that falls on erythrocytes in the lumen of the blood vessel 2160 as if there were no change in refractive index between the emitted light and the light reflected by the moving blood cells. In that case, only a Doppler shift in the wavelength of the reflected light would be detected. However, the light 2176a to 2176c actually reflected by the blood cells can incorporate phase changes due to the variation in tissue refractive indices in addition to wavelength changes due to the Doppler effect.
[0366] [0366] Thus, it can be understood that if the light sensor receives the incident light, the light 2172 and 2174a,b reflected from one or more tissue interfaces, and the Doppler-shifted light 2176a to 2176c from the blood cells, the interference pattern thereby produced at the light sensor can include effects due to the Doppler effect (wavelength change) as well as effects due to changes in the refractive index within the tissue (phase variation). As a result, a Doppler analysis of the light reflected by the tissue sample can produce erroneous results if the effects due to changes in the refractive index within the sample are not compensated for.
[0367] [0367] As previously disclosed, Figure 29 illustrates an example of the effects on a Doppler analysis of light falling on a tissue sample 2250 to determine the depth and location of an underlying blood vessel. If there is no intervening tissue between the blood vessel and the tissue surface, the interference pattern detected at the sensor may be due mainly to the wavelength change of the light reflected from the moving blood cells. As a result, a spectrum 2252 derived from the interference pattern can generally reflect only the Doppler shift of the blood cells. However, if there is intervening tissue between the blood vessel and the tissue surface, the interference pattern detected at the sensor may be due to a combination of the wavelength change of the light reflected from the moving blood cells and the phase shift due to the refractive index of the intervening tissue. A spectrum 2254 derived from such an interference pattern can yield a Doppler calculation that is confounded by the additional phase shift in the reflected light. In some aspects, if information regarding the characteristics (thickness and refractive index) of the intervening tissue is known, the resulting spectrum 2256 can be corrected to provide a more accurate calculation of the wavelength change.
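The compensation idea behind the corrected spectrum 2256 can be sketched quantitatively: a known intervening layer of thickness d and refractive index n adds a round-trip optical path of 2·(n − n_ref)·d, and the corresponding phase, 2π times that path over the wavelength, can be subtracted before Doppler analysis. This is a simplified illustration under stated assumptions (a single uniform layer, normal incidence); the function name and values are invented for the example.

```python
# Hedged sketch: extra round-trip phase contributed by an intervening
# tissue layer of known thickness and refractive index, relative to a
# reference index (e.g. the assumed bulk index used in the analysis).
import math

def intervening_phase_shift(wavelength_nm, thickness_mm, n_tissue, n_ref=1.0):
    """Round-trip phase shift (radians) added by the intervening layer."""
    extra_path_nm = 2.0 * (n_tissue - n_ref) * thickness_mm * 1e6  # mm -> nm
    return 2.0 * math.pi * extra_path_nm / wavelength_nm
```

For instance, a 500 nm-thick layer with an index step of 0.5 adds exactly one fringe (2π) at a 500 nm wavelength; thicker or higher-index layers add proportionally more phase to remove.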
[0368] [0368] It can be recognized that the phase shift in reflected light from a tissue can provide additional information regarding the underlying tissue structures, regardless of Doppler effects.
[0369] [0369] Figure 37 illustrates that the location and characteristics of non-vascular structures can be determined based on the phase difference between the incident light 2372 and the light reflected from deep tissue structures 2374, 2376, 2378. As noted above, the penetration depth of light falling on a tissue depends on the wavelength of the incident illumination. Red laser light (having a wavelength in the range of about 635 nm to about 660 nm) can penetrate tissue to a depth of about 1 mm. Green laser light (having a wavelength in the range of about 520 nm to about 532 nm) can penetrate tissue to a depth of about 2 to 3 mm. Blue laser light (having a wavelength in the range of about 405 nm to about 445 nm) can penetrate tissue to a depth of about 4 mm or more. In one aspect, an interface 2381a between two tissues that differ in refractive index, situated less than or about 1 mm below a tissue surface 2380, may reflect 2374 red, green or blue laser light. The phase of the reflected light 2374 can be compared with that of the incident light 2372, and thus the difference in tissue refractive index at the interface 2381a can be determined. In another aspect, an interface 2381b between two tissues that differ in refractive index, situated between 2 and 3 mm 2381b below a tissue surface 2380, may reflect 2376 green or blue laser light, but not red light. The phase of the reflected light 2376 can be compared with that of the incident light 2372, and thus the difference in tissue refractive index at the interface 2381b can be determined. In yet another aspect, an interface 2381c between two tissues that differ in refractive index, situated between 3 and 4 mm 2381c below a tissue surface 2380, may reflect 2378 only blue laser light, but not red or green light. The phase of the reflected light 2378 can be compared with that of the incident light 2372, and thus the difference in tissue refractive index at the interface 2381c can be determined.
[0370] [0370] The measurement of phase interference from a tissue illuminated with light of different wavelengths can therefore provide information on the relative refractive indices of the reflecting tissue as well as on the depth of the reflecting structure. The refractive indices of the tissue can be evaluated using multiple laser sources and their intensities, and thus the relative refractive indices of the tissue can be calculated. It is recognized that different tissues can have different refractive indices. For example, the refractive index may be related to the relative composition of collagen and elastin in a tissue or to the amount of hydration in the tissue. Therefore, a technique for measuring the relative refractive index of a tissue can lead to the identification of the tissue composition.
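One standard way the reflected intensity relates to the refractive-index step, mentioned above as a basis for evaluating tissue indices, is the Fresnel reflectance at normal incidence: R = ((n1 − n2)/(n1 + n2))². The sketch below is textbook optics offered to make the paragraph concrete; it is not the patented algorithm, and the index values are illustrative.

```python
# Illustrative sketch: Fresnel reflectance at normal incidence, and its
# inversion to recover the second medium's index from a measured R
# (taking the branch where the second medium is the denser one).

def fresnel_reflectance(n1, n2):
    """Fraction of intensity reflected at a planar interface, normal incidence."""
    return ((n1 - n2) / (n1 + n2)) ** 2

def n2_from_reflectance(n1, reflectance):
    """Solve R = ((n1 - n2)/(n1 + n2))^2 for n2, assuming n2 > n1."""
    r = reflectance ** 0.5
    return n1 * (1.0 + r) / (1.0 - r)
```

For example, an air-to-index-1.5 interface reflects 4% of the intensity, and a measured 4% reflectance inverts back to an index of 1.5; relative index differences between tissue layers follow the same relation with smaller steps.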
[0371] [0371] In some aspects, smart surgical instruments include algorithms to determine the parameters associated with the function of the instruments. A non-limiting example of such a parameter may be the pressure of an anvil of an intelligent stapling device against a tissue. The amount of pressure of an anvil against a tissue may depend on the type and composition of the tissue. For example, less pressure may be needed to staple a highly compressible tissue, while more pressure may be required to staple a less compressible tissue. Another non-limiting example of a parameter associated with an intelligent surgical device may include the firing rate of an I-beam knife for cutting the tissue. For example, a rigid tissue may require more force and a slower cutting rate than a less rigid tissue. Another non-limiting example of such a parameter may be the amount of current supplied to an electrode in a smart RF cauterization or sealing device. The composition of the tissue, such as the percentage of tissue hydration, can determine the amount of current required to heat-seal the tissue. Yet another non-limiting example of such parameters may be the amount of energy supplied to an ultrasonic transducer of an intelligent ultrasonic cutting device, or the activation frequency of the cutting device. A rigid tissue may require more energy to be cut, and contact of the ultrasonic cutting tool with a rigid tissue can shift the cutter's resonant frequency.
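The qualitative relationships in the examples above, for instance more anvil pressure for less compressible tissue and a slower knife for stiffer tissue, can be sketched as a toy parameter-selection function. Everything here is invented for illustration: the function name, the scaling laws and the units are assumptions, not the disclosed control algorithm of any smart stapling device.

```python
# Purely illustrative sketch: derive relative stapler operating parameters
# from tissue characterization data. Higher compressibility -> less clamp
# pressure; higher stiffness -> slower knife firing rate.

def stapler_params(compressibility, stiffness):
    """Return (anvil_pressure, knife_rate) in arbitrary relative units."""
    pressure = 1.0 / max(compressibility, 0.1)  # firmer clamp for less compressible tissue
    knife_rate = 1.0 / (1.0 + stiffness)        # slower cut for more rigid tissue
    return pressure, knife_rate
```

The point of the sketch is only the direction of the dependencies: a device receiving the visualization system's tissue data could scale its parameters monotonically in this way, with the actual mapping determined by the device's own algorithms.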
[0372] [0372] It can be recognized that a tissue visualization system that can identify the type and depth of tissue can provide these data to one or more intelligent surgical devices. The identification and location data can then be used by the smart surgical devices to adjust one or more of their operating parameters, thus enabling them to optimize their tissue manipulation. It can be understood that an optical method of characterizing a tissue type can allow automation of the operating parameters of smart surgical devices. Such automation of the operation of intelligent surgical instruments may be preferable to relying on human estimation to determine the instruments' operating parameters.
[0373] [0373] In one aspect, optical coherence tomography (OCT) is a technique that can visualize subsurface tissue structures based on the phase difference between an illuminating light source and the light reflected by structures located within the tissue. Figure 38 schematically depicts an example of instrumentation 2470 for optical coherence tomography. In Figure 38, a laser source 2472 can emit light 2482 at any optical wavelength of interest (red, green, blue, infrared or ultraviolet). The light 2482 can be directed to a dichroic beamsplitter 2486. The dichroic beamsplitter 2486 directs a portion of the light 2488 to a tissue sample 2480. The dichroic beamsplitter 2486 can also direct a portion of the light 2492 to a stationary reference mirror
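The interferometric principle behind the instrumentation 2470 can be sketched with the standard two-beam interference formula: when the sample and reference arms are recombined, the detected intensity varies with the round-trip path-length difference ΔL as cos(2π·2ΔL/λ). This is a simplified textbook model added for illustration, not the disclosed detection electronics; equal arm intensities are assumed.

```python
# Hedged sketch of two-beam interference at the OCT detector: intensity as
# a function of the arm path-length difference (factor 2 for the round trip).
import math

def detector_intensity(path_diff_nm, wavelength_nm, i_sample=1.0, i_ref=1.0):
    """Detected intensity for a given one-way path difference between arms."""
    phase = 2.0 * math.pi * (2.0 * path_diff_nm) / wavelength_nm
    return i_sample + i_ref + 2.0 * math.sqrt(i_sample * i_ref) * math.cos(phase)
```

Matched arms interfere constructively (maximum intensity), while a one-way mismatch of a quarter wavelength gives a half-wave round trip and a null; scanning the reference mirror therefore maps reflector depth into fringe position.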
[0374] [0374] As disclosed above, depth information about subsurface tissue structures can be determined from a combination of the laser light wavelength and the phase of the light reflected by a deep tissue structure. Additionally, local inhomogeneity of a tissue surface can be determined by comparing the phase and amplitude differences of the light reflected from different portions of the same subsurface tissues. Measurements of a difference in the surface properties of the tissue at a defined location compared to those at a neighboring site may be indicative of adhesions, disorganization of the tissue layers, infection, or a neoplasm in the tissue being probed.
[0375] [0375] Figure 39 illustrates this effect. The characteristics of a tissue surface determine the angle of reflection of the light that falls on the surface. A smooth surface 2551a reflects incident light 2542 with essentially the same spread 2544 as the incident light (specular reflection). Consequently, a light detector that has a known, fixed aperture can effectively receive the entire amount of light 2544 reflected from the smooth surface 2551a. However, greater roughness of a tissue surface can result in a greater spread in the reflected light compared to the incident light (diffuse reflection).
[0376] [0376] A certain amount of the light 2546 reflected from a tissue surface that has some surface irregularities 2551b will fall outside the fixed aperture of the light detector due to the increased spread of the reflected light 2546. As a result, the light detector will detect less light (shown in Figure 39 as a decrease in the amplitude of the reflected light signal 2546). It should be understood that the spread of the reflected light will increase as the surface roughness of a tissue increases. Thus, as shown in Figure 39, the light 2548 reflected from a surface 2551c that has significant surface roughness may have a smaller amplitude than the light 2544 reflected from a smooth surface 2551a, or the light 2546 reflected from a surface that has only a moderate amount of surface roughness 2551b. Therefore, in some aspects, a single laser source can be used to investigate the quality of a tissue surface or subsurface by comparing the optical properties of the light reflected from the tissue with the optical properties of the light reflected from adjacent surfaces.
[0377] [0377] In other aspects, light from multiple laser sources (for example, lasers that emit light with different central wavelengths) can be used sequentially to probe the characteristics of the tissue surface at a variety of depths below the surface 2550. As disclosed above (with reference to Figure 37), the absorbance profile of a laser light in a tissue depends on the central wavelength of the laser light. Laser light that has a shorter central wavelength (more blue) can penetrate tissue more deeply than laser light that has a longer central wavelength (more red). Therefore, measurements related to diffuse light reflection made at different wavelengths of light can indicate an amount of surface roughness and the depth of the surface being measured.
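The amplitude comparison described above can be sketched as a simple roughness index per illumination wavelength. This is an illustrative toy metric, not the disclosed algorithm; the function names and normalization are invented:

```python
def roughness_index(incident_amp: float, reflected_amp: float) -> float:
    """Fraction of the incident light scattered outside the detector's
    fixed aperture: 0.0 for a perfectly specular (smooth) surface,
    approaching 1.0 as diffuse reflection increases with roughness."""
    return 1.0 - reflected_amp / incident_amp


def roughness_profile(measurements):
    """measurements: iterable of (wavelength_nm, incident_amp, reflected_amp).

    Returns a {wavelength: roughness} map; in this document's convention,
    shorter (bluer) wavelengths probe deeper below the surface, so each
    entry characterizes roughness at a different depth."""
    return {wl: roughness_index(inc, ref) for wl, inc, ref in measurements}
```

A smooth surface returns all light into the aperture (index 0), while a rough layer probed by a shorter wavelength would show a larger index.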
[0378] [0378] Figure 40 illustrates a method of displaying image processing data related to a combination of tissue visualization modalities. The data used on the screen can be derived from image phase data related to tissue layer composition, image intensity (amplitude) data related to tissue surface characteristics and image wavelength data related to tissue mobility (such as blood cell transport) and tissue depth. As an example, the light emitted by a laser in the blue 2562 optical region can fall on the blood flowing to a depth of about 4 mm below the surface of the tissue. The reflected light 2564 can be shifted to red due to the Doppler effect of blood flow. As a result, information regarding the existence of a blood vessel and its depth below the surface can be obtained.
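The red shift described above maps to a flow speed through the standard Doppler relation for backscattered light. The sketch below is a textbook formula, not the disclosed implementation, and the example numbers are invented:

```python
C_M_PER_S = 2.998e8  # speed of light in vacuum, m/s


def flow_speed_from_doppler(lambda_emitted_nm: float,
                            lambda_reflected_nm: float,
                            cos_theta: float = 1.0) -> float:
    """Estimate the speed (m/s) of blood cells from the red shift of
    scattered laser light, using the double Doppler shift
    (source -> moving cell -> detector):
        dlambda / lambda = 2 * v * cos(theta) / c
    where theta is the angle between the beam and the flow direction."""
    dlam = lambda_reflected_nm - lambda_emitted_nm
    return C_M_PER_S * dlam / (2.0 * lambda_emitted_nm * cos_theta)
```

A fractional shift of two parts per billion of a 450 nm beam aligned with the flow corresponds to roughly 0.3 m/s, a plausible order of magnitude for blood flow.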
[0379] [0379] In another example, a layer of tissue may be at a depth of about 2 to 3 mm below the surface of the surgical site. This tissue may include surface irregularities indicative of scarring or other pathologies. The emitted red light 2572 may not penetrate to a depth of 2 to 3 mm; consequently, the reflected red light 2580 may have approximately the same amplitude as the emitted red light 2572, because it is unable to probe structures located more than 1 mm below the top surface of the surgical site. However, the green light 2578 reflected by the tissue can reveal the existence of surface irregularities at that depth, in that the amplitude of the reflected green light 2578 may be less than the amplitude of the emitted green light 2570. Similarly, the blue light 2574 reflected from the tissue may reveal surface irregularities at that depth, in that the amplitude of the reflected blue light 2574 may be less than the amplitude of the emitted blue light 2562. In an example of an image processing step, the image 2582 can be smoothed with the use of a moving-window filter 2584 to reduce noise between pixels as well as to reduce small local tissue anomalies 2586 that can hide more important features 2588.
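One minimal sketch of such a moving-window (mean) filter, assuming the image is represented as a plain 2-D list of intensities (the disclosure does not specify the filter kernel; a simple box average is used here for illustration):

```python
def smooth_image(img, window=3):
    """Mean filter over a square moving window. Pixels near the border
    average over whatever neighborhood is available, so the output has
    the same dimensions as the input."""
    h, w = len(img), len(img[0])
    half = window // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[yy][xx]
                    for yy in range(max(0, y - half), min(h, y + half + 1))
                    for xx in range(max(0, x - half), min(w, x + half + 1))]
            out[y][x] = sum(vals) / len(vals)
    return out
```

A single-pixel spike (a small local anomaly) is spread across the window and attenuated, while uniform regions pass through unchanged — the behavior the paragraph describes.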
[0380] [0380] Figures 41A to 41C illustrate various aspects of displays that can be provided to a surgeon for visual identification of the surface and subsurface structures of a tissue at a surgical site. Figure 41A can represent a surface map of the surgical site with color coding to indicate structures located at different depths below the surface of the surgical site. Figure 41B shows an example of one of several horizontal cuts across the tissue at different depths, which can be color-coded to indicate depth and also include data associated with differences in tissue surface anomalies
[0381] [0381] Figure 42 is a flow chart 2950 of a method for providing information related to a tissue characteristic to an intelligent surgical instrument. An image capture system can illuminate 2960 a tissue with a first beam of light having a first central frequency and receive 2962 a first reflected light from the tissue illuminated by the first beam of light. The image capture system can then calculate 2964 a first surface feature at a first depth based on the first emitted beam of light and the first light reflected from the tissue. The image capture system can then illuminate 2966 the tissue with a second beam of light having a second central frequency and receive 2968 a second light reflected from the tissue illuminated by the second beam of light. The image capture system can then calculate a second surface feature at a second depth based on the second emitted beam of light and the second light reflected from the tissue. The characteristics of the tissue, which may include a tissue type, a tissue composition, and a tissue surface roughness metric, can be determined from the first central light frequency, the second central light frequency, the first light reflected from the tissue and the second light reflected from the tissue. The tissue characteristic can be used to calculate 2972 one or more parameters related to the function of an intelligent surgical instrument, such as clamp pressure, energy to effect tissue cauterization, or amplitude and/or frequency of the current to drive a piezoelectric actuator to cut a tissue. In some additional examples, the parameter can be transmitted 2974 directly or indirectly to the intelligent surgical instrument, which can modify its operational characteristics in response to the tissue being manipulated.
Minimally invasive multifocal camera
[0382] [0382] In a minimally invasive procedure, for example, laparoscopic surgery, a surgeon can view the surgical site with the use of imaging instruments that include a light source and a camera. Imaging instruments can allow the surgeon to view the end effector of a surgical device during the procedure. However, the surgeon may need to view the tissue in the direction opposite the end effector to avoid inadvertent damage during surgery. Such distant tissue may be out of sight of the camera system when it is focused on the end effector. The imaging instrument can be moved to change the camera's field of view, but it can be difficult to return the camera system to its original position after it has been moved.
[0383] [0383] The surgeon may attempt to move the imaging system within the surgical site to view different portions of the site during the procedure. The repositioning of the imaging system takes time and the surgeon has no guarantee of viewing the same field of view of the surgical site when the imaging system is returned to its original location.
[0384] [0384] Therefore, it is desirable to have a medical imaging visualization system that can provide multiple fields of view of the surgical site without the need to reposition the visualization system. Medical imaging devices include, without limitation, laparoscopes, endoscopes, thoracoscopes and the like, as described in the present invention. In some aspects, a single display system can display each of the multiple fields of view of the surgical site at approximately the same time. The display of each of the multiple fields of view can be independently updated by a display control system comprising one or more hardware modules, one or more software modules, one or more firmware modules, or any combination thereof.
[0385] [0385] Some aspects of the present description also provide a control circuit configured to control the illumination of a surgical site using one or more light sources, such as laser light sources, and to receive imaging data from one or more image sensors. In some aspects, the control circuit can be configured to control the operation of one or more light sensor modules to adjust a field of view. In some aspects, the present description provides a non-transitory, computer-readable medium that stores computer-readable instructions that, when executed, cause a device to adjust one or more components of one or more light sensor modules and process an image from each of the one or more light sensor modules.
[0386] [0386] One aspect of a minimally invasive image capture system may comprise a plurality of light sources, wherein each light source is configured to emit light with a specified central wavelength; a first light sensing element that has a first field of view and is configured to receive reflected light from a first portion of the surgical site when the first portion of the surgical site is illuminated by at least one of the plurality of light sources; a second light sensing element that has a second field of view and is configured to receive reflected illumination from a second portion of the surgical site when the second portion of the surgical site is illuminated by at least one of the plurality of light sources, in which the second field of view overlaps at least a portion of the first field of view; and a computing system.
[0387] [0387] The computing system can be configured to receive data from the first light sensing element, receive data from the second light sensing element, compute imaging data based on the data received from the first light sensing element and the data received from the second light sensing element, and transmit the imaging data for reception by a display system.
[0388] [0388] A variety of surgical visualization systems have been revealed above. Such systems provide visualization of tissue and tissue substructures that can be found during one or more surgical procedures. Non-limiting examples of such systems may include: systems for determining the location and depth of the subsurface vascular tissue, such as veins and arteries; systems for determining the amount of blood flowing through the subsurface vascular tissue; systems for determining the depth of non-vascular tissue structures; systems for characterizing the composition of such non-vascular tissue structures; and systems for characterizing one or more surface characteristics of such tissue structures.
[0389] [0389] It can be recognized that a single surgical visualization system can incorporate components from any one or more of these visualization modalities. Figures 22A to 22D represent some examples of such a surgical visualization system
[0390] [0390] As shown above, in a non-limiting aspect, a surgical visualization system 2108 can include an imaging control unit 2002 and a hand unit 2020. The hand unit 2020 can include a body 2021, a camera scope 2015 attached to the body 2021, and an elongated camera
[0391] [0391] Alternatively, the illumination of the surgical site may be cycled among the visible light sources, as shown in Figure 30D. In some examples, the light sources may include any one or more of a red laser 2360a, a green laser 2360b or a blue laser 2360c. In some non-limiting examples, the red laser light source 2360a can generate illumination that has a peak wavelength that can vary between 635 nm and 660 nm, inclusive. Non-limiting examples of a red laser peak wavelength may include about 635 nm, about 640 nm, about 645 nm, about 650 nm, about 655 nm, about 660 nm, or any value or range of values therebetween. In some non-limiting examples, the green laser light source 2360b can generate illumination that has a peak wavelength that can range from 520 nm to 532 nm, inclusive. Non-limiting examples of a green laser peak wavelength may include about 520 nm, about 522 nm, about 524 nm, about 526 nm, about 528 nm, about 530 nm, about 532 nm, or any value or range of values therebetween. In some non-limiting examples, the blue laser light source 2360c can generate illumination that has a peak wavelength that can vary between 405 nm and 445 nm, inclusive. Non-limiting examples of a blue laser peak wavelength may include about 405 nm, about 410 nm, about 415 nm, about 420 nm, about 425 nm, about 430 nm, about 435 nm, about 440 nm, about 445 nm, or any value or range of values therebetween.
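A frame-by-frame cycling schedule like the one shown in Figure 30D can be sketched as follows. The specific wavelengths are example values chosen from within the ranges disclosed above, not values taken from the figure:

```python
from itertools import cycle, islice

# Example central wavelengths (nm) from within the disclosed ranges:
# red 635-660 nm, green 520-532 nm, blue 405-445 nm.
SOURCES_NM = [650, 525, 415]


def illumination_schedule(n_frames: int):
    """Assign one laser source per captured frame, cycling
    red -> green -> blue, then repeating."""
    return list(islice(cycle(SOURCES_NM), n_frames))
```

Each captured frame is then attributed to a single known wavelength, which is what allows the per-wavelength sensor outputs to be recombined into one image downstream.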
[0392] [0392] Additionally, the illumination of the surgical site can be cycled to include non-visible light sources that can provide infrared or ultraviolet illumination. In some non-limiting examples, an infrared laser light source can generate illumination that has a peak wavelength that can range between 750 nm and 3000 nm, inclusive. Non-limiting examples of a peak infrared laser wavelength may include about 750 nm, about 1,000 nm, about 1,250 nm, about 1,500 nm, about
[0393] [0393] The sensor matrix outputs under different illumination wavelengths can be combined to form the image
[0394] [0394] Figure 44A represents the distal end of a flexible elongated camera probe 2120 with a flexible camera probe shaft 2122 and a single light sensor module 2124 disposed at the distal end 2123 of the flexible camera probe shaft 2122. In some non-limiting examples, the flexible camera probe shaft 2122 can have an outside diameter of about 5 mm. The outside diameter of the flexible camera probe shaft 2122 may depend on geometric factors that may include, without limitation, the amount of flexion allowed at the distal end 2123 of the shaft. As shown in Figure 44A, the distal end 2123 of the flexible camera probe shaft 2122 can bend about 90° with respect to a longitudinal geometric axis of a non-curved portion of the flexible camera probe shaft 2122 located at a proximal end of the elongated camera probe 2120. It can be recognized that the distal end 2123 of the flexible camera probe shaft 2122 can bend by any amount suitable for its function. Thus, as non-limiting examples, the distal end 2123 of the flexible camera probe shaft 2122 can bend by any amount between about 0° and about 90°. Non-limiting examples of the flexion angle of the distal end 2123 of the flexible camera probe shaft 2122 may include about 0°, about 10°, about 20°, about 30°,
[0395] [0395] The single light sensor module 2124 can receive reflected light from the tissue when it is illuminated by light emitted by one or more light sources 2126 disposed at the distal end of the elongated camera probe. In some examples, the light sensor module 2124 may be a 4 mm sensor module, such as one of the bezels shown in Figure 22D. It can be recognized that the light sensor module 2124 can be any size suitable for its intended function. Thus, the light sensor module 2124 can include a 5.5 mm bezel 2136a, a 2.7 mm bezel 2136c or a 2 mm bezel 2136d, as shown in Figure 22D.
[0396] [0396] It can be recognized that the one or more light sources 2126 can include any number of light sources including, without limitation, one light source, two light sources, three light sources, four light sources, or more than four light sources. It can be further understood that each light source can provide illumination with any central wavelength, including a central wavelength of red illumination, a central wavelength of green illumination, a central wavelength of blue illumination, a central wavelength of infrared illumination, a central wavelength of ultraviolet illumination, or any other wavelength. In some examples, the one or more light sources 2126 may include a white light source, which can illuminate the tissue with light that has wavelengths that can cover the optical white-light range from about 390 nm to about 700 nm.
[0397] [0397] Figure 44B represents the distal end 2133 of an alternative elongated camera probe 2130 with multiple light sensor modules, for example, the two light sensor modules 2134a and 2134b, each disposed at the distal end 2133 of the elongated camera probe 2130. In some non-limiting examples, the alternative elongated camera probe 2130 can have an outside diameter of about 7 mm. In some examples, the light sensor modules 2134a and 2134b may each comprise a 4 mm sensor module, similar to the light sensor module 2124 in Figure 44A. Alternatively, each of the light sensor modules 2134a and 2134b may comprise a 5.5 mm light sensor module, a 2.7 mm light sensor module or a 2 mm light sensor module, as depicted in Figure 22D. In some examples, both light sensor modules 2134a and 2134b may be the same size. In some examples, the light sensor modules 2134a and 2134b may have different sizes. As a non-limiting example, an alternative elongated camera probe 2130 may have a first 4 mm light sensor and two additional 2 mm light sensors. In some aspects, a visualization system can combine the optical outputs of the multiple light sensor modules 2134a and 2134b to form a 3D or near-3D image of the surgical site. In some other aspects, the outputs of the multiple light sensor modules 2134a and 2134b can be combined in order to improve the optical resolution of the surgical site, which may otherwise not be practical with just a single light sensor module.
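One plausible way two side-by-side sensor modules could yield a near-3D image is the classic pinhole-stereo relation. This is standard computer-vision geometry offered as a sketch, not the method claimed in the disclosure, and the numbers in the example are invented:

```python
def depth_from_disparity(focal_px: float, baseline_mm: float,
                         disparity_px: float) -> float:
    """Pinhole stereo: z = f * b / d. Two parallel sensors separated by
    baseline_mm that both see the same tissue feature can estimate its
    distance from the feature's pixel disparity between the two images."""
    return focal_px * baseline_mm / disparity_px
```

With an assumed 800-pixel focal length and a 4 mm baseline, a 16-pixel disparity would place a feature about 200 mm away; larger disparities correspond to closer features.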
[0398] [0398] Each of the multiple light sensor modules 2134a and 2134b can receive reflected light from the tissue when it is illuminated by light emitted by one or more light sources 2136a and 2136b disposed at the distal end 2133 of the alternative elongated camera probe 2130. In some non-limiting examples, the light emitted by all of the light sources 2136a and 2136b can be derived from the same light source (such as a laser). In other non-limiting examples, light sources 2136a surrounding a first light sensor module 2134a can emit light at a first wavelength, and light sources 2136b surrounding a second light sensor module 2134b can emit light at a second wavelength. It can be further understood that each light source 2136a and 2136b can provide illumination with any central wavelength, including a central wavelength of red illumination, a central wavelength of green illumination, a central wavelength of blue illumination, a central wavelength of infrared illumination, a central wavelength of ultraviolet illumination, or any other wavelength. In some examples, the one or more light sources 2136a and 2136b may include a white light source, which can illuminate the tissue with light that has wavelengths that can cover the optical white-light range from about 390 nm to about 700 nm.
[0399] [0399] In some additional aspects, the distal end 2133 of the elongated camera probe 2130 may include one or more working channels 2138. Such working channels 2138 may be in fluid communication with a suction port of a device for aspirating material from the surgical site, thus allowing the removal of material that can potentially obscure the field of view of the light sensor modules 2134a and 2134b. Alternatively, such 2138 working channels may be in fluid communication with a fluid source port of a device to supply fluid to the surgical site, to purge debris or material away from the surgical site. Such fluids can be used to clear material from the field of view of the light sensor modules 2134a and 2134b.
[0400] [0400] Figure 44C represents a perspective view of an aspect of a 2160 monolithic sensor that has a plurality of pixel arrays to produce a three-dimensional image according to the teachings and principles of the description. Such an implementation may be desirable for capturing three-dimensional images, in which the two pixel arrays 2162 and 2164 can be moved during use. In another implementation, a first array of pixels 2162 and a second array of pixels 2164 can be dedicated to receive a predetermined range of wavelengths of electromagnetic radiation, where the first array of pixels 2162 is dedicated to a range of electromagnetic radiation wavelength different from the second pixel array 2164.
[0401] [0401] Additional disclosures of a dual sensor matrix can be found in US Patent Application Publication No. 2014/0267655, entitled SUPER RESOLUTION AND COLOR MOTION ARTIFACT CORRECTION IN A PULSED COLOR IMAGING SYSTEM, filed on March 14, 2014, which was granted on May 2, 2017 as US patent No. 9,641,815, the contents of which are incorporated herein by reference in their entirety and for multiple purposes.
[0402] [0402] In some respects, a light sensor module may comprise a multi-pixel light sensor, such as a CMOS matrix, in addition to one or more additional optical elements, such as a lens, a reticle and a filter.
[0403] [0403] In some alternative aspects, the one or more light sensors may be located inside the body 2021 of the hand unit
[0404] [0404] The images obtained from each of the multiple light sensors, for example, 2134a and 2134b, can be combined or processed in several different ways, in combination or separately, and then displayed in order to allow a surgeon to view different aspects of the surgical site.
[0405] [0405] In a non-limiting example, each light sensor can have an independent field of view. In some additional examples, the field of view of a first light sensor may partially or completely overlap the field of view of a second light sensor.
[0406] [0406] As shown above, an imaging system may include a hand unit 2020 that has an elongated camera probe 2024 with one or more light sensor modules 2124, 2134a and 2134b disposed at its distal end 2123, 2133. As an example, the elongated camera probe 2024 may have two light sensor modules 2134a and 2134b, although it can be recognized that there may be three, four, five or more light sensor modules at the distal end of the elongated camera probe 2024. Although Figures 45 and 46A to 46D represent examples of the distal end of an elongated camera probe that has two light sensor modules, it can be recognized that the description of the operation of the light sensor modules is not limited to just two light sensor modules. As shown in Figures 45 and 46A to 46D, the light sensor modules can include an image sensor, such as a CCD or CMOS sensor, which can be composed of an array of light sensing elements (pixels). The light sensor modules can also include additional optical elements, such as lenses. Each lens can be adapted to provide a field of view for the light sensor of the respective light sensor module.
[0407] [0407] Figure 45 represents a general view of a distal end 2143 of an elongated camera probe that has multiple light sensor modules 2144a and 2144b. Each light sensor module 2144a and 2144b can be composed of a CCD or CMOS sensor and one or more optical elements such as filters, lenses, shutters and the like. In some aspects, the components of the light sensor modules 2144a and 2144b can be fixed inside the elongated camera probe. In some other aspects, one or more of the components of the light sensor modules 2144a and 2144b may be adjustable. For example, the CCD or CMOS sensor of a light sensor module 2144a and 2144b can be mounted on a movable bezel to allow automated adjustment of the center 2145a and 2145b of a field of view 2147a and 2147b of the CCD or CMOS sensor. In some other aspects, the CCD or CMOS sensor can be fixed, but a lens on each light sensor module 2144a and 2144b can be adjustable to change the focus. In some aspects, light sensor modules
[0408] [0408] As shown in Figure 45, each of the sensor modules 2144a and 2144b can have a field of view 2147a and 2147b with an acceptance angle. As shown in Figure 45, each of the sensor modules 2144a and 2144b can have an acceptance angle greater than 90°. In some examples, the acceptance angle can be about 100°. In some examples, the acceptance angle can be about 120°. In some examples, if the sensor modules 2144a and 2144b have an acceptance angle greater than 90° (for example, 100°), the fields of view 2147a and 2147b can form an overlap region 2150a and 2150b. In some aspects, an optical field of view that has an acceptance angle of 100° or more can be called a "fisheye" field of view. A visualization system control system associated with such an elongated camera probe may include computer-readable instructions that may allow the display of the overlapping region 2150a and 2150b in such a way that the extreme curvature of the overlapping fisheye fields of view is corrected, and an improved, flattened image can be displayed. In Figure 45, the overlapping region 2150a can represent a region where the overlapping fields of view 2147a and 2147b of the sensor modules 2144a and 2144b have their respective centers 2145a and 2145b directed in a forward direction. However, if any one or more components of the sensor modules 2144a and 2144b are adjustable, it can be recognized that the overlap region 2150b can be aimed at any angle attainable within the fields of view 2147a and 2147b of the sensor modules 2144a and 2144b.
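As a rough geometric sketch of how acceptance angle relates to the overlap region (a simplified 2-D model with invented example numbers, not values taken from the disclosure):

```python
import math


def overlap_start_distance(separation_mm: float,
                           acceptance_angle_deg: float) -> float:
    """Distance in front of two parallel, forward-facing sensor modules at
    which their fields of view begin to overlap, in a flat 2-D model:
    the inner edges of the two cones cross at
        z = d / (2 * tan(acceptance_angle / 2))
    where d is the center-to-center sensor separation."""
    half_angle = math.radians(acceptance_angle_deg / 2.0)
    return separation_mm / (2.0 * math.tan(half_angle))
```

Under this model, widening the acceptance angle moves the start of the overlap region closer to the probe tip, which is consistent with the overlap described for angles above 90°.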
[0409] [0409] Figures 46A to 46D represent a variety of examples of an elongated camera probe with two light sensor modules 2144a and 2144b having a variety of fields of view. The elongated camera probe can be directed to view a surface 2152 of a surgical site.
[0410] [0410] In Figure 46A, the first light sensor module 2144a has a first sensor field of view 2147a of a tissue surface 2154a, and the second light sensor module 2144b has a second sensor field of view 2147b of a tissue surface 2154b. As shown in Figure 46A, the first field of view 2147a and the second field of view 2147b have approximately the same angle of view. In addition, the first sensor field of view 2147a is adjacent to, but does not overlap, the second sensor field of view 2147b. The image received by the first light sensor module 2144a can be displayed separately from the image received by the second light sensor module 2144b, or the images can be combined to form a single image. In some non-limiting examples, the viewing angle of a lens associated with the first light sensor module 2144a and the viewing angle of a lens associated with the second light sensor module 2144b can be somewhat narrow, so the image distortion at the periphery of their respective images may be small. Therefore, the images can be easily combined edge to edge.
[0411] [0411] As shown in Figure 46B, the first field of view 2147a and the second field of view 2147b have approximately the same angular field of view, and the first sensor field of view 2147a completely overlaps the second sensor field of view 2147b. This can result in the first sensor field of view 2147a of a tissue surface 2154a being identical to the view of a tissue surface 2154b as obtained by the second light sensor module 2144b from the second sensor field of view 2147b. This configuration can be useful for applications in which the image from the first light sensor module 2144a can be processed differently from the image from the second light sensor module 2144b. The information in the first image can complement the information in the second image while referring to the same piece of tissue.
[0412] [0412] As shown in Figure 46C, the first field of view 2147a and the second field of view 2147b have approximately the same angular field of view, and the first sensor field of view 2147a partially overlaps the second sensor field of view 2147b. In some non-limiting examples, a lens associated with the first light sensor module 2144a and a lens associated with the second light sensor module 2144b can be wide-angle lenses. These lenses can permit viewing of a wider field of view than that shown in Figure 46A. Wide-angle lenses are known to have significant optical distortion at their periphery. Proper processing of the images obtained by the first light sensor module 2144a and the second light sensor module 2144b can allow the formation of a combined image in which the central portion of the combined image is corrected for any distortion induced by the first lens or the second lens. It can be understood that a portion of the first sensor field of view 2147a of a tissue surface 2154a may therefore have some distortion due to the wide-angle nature of a lens associated with the first light sensor module 2144a, and a portion of the second sensor field of view 2147b of a tissue surface 2154b may therefore have some distortion due to the wide-angle nature of a lens associated with the second light sensor module 2144b. However, a portion of the tissue seen in the overlapping region 2150' of the two light sensor modules 2144a and 2144b can be corrected for any distortion induced by either of the light sensor modules 2144a and 2144b. The configuration shown in Figure 46C can be useful for applications in which a wide field of view of the tissue around a portion of a surgical instrument is desired during a surgical procedure. In some examples, the lenses associated with each light sensor module 2144a and 2144b may be independently controllable, thereby controlling the location of the overlapping region 2150' of the view within the combined image.
[0413] [0413] As shown in Figure 46D, the first light sensor module 2144a can have a first angular field of view 2147a that is wider than the second angular field of view 2147b of the second light sensor module 2144b. In some non-limiting examples, the second sensor field of view 2147b can be fully disposed within the first sensor field of view 2147a. In alternative examples, the second sensor field of view may be outside of or tangent to the wide-angle field of view 2147a of the first sensor 2144a. A display system that uses the configuration shown in Figure 46D can display a wide-angle tissue portion 2154a imaged by the first sensor module 2144a along with a second, enlarged tissue portion 2154b that is imaged by the second sensor module 2144b and is located at an overlap region 2150" of the first field of view 2147a and the second field of view 2147b. This configuration can be useful for presenting a surgeon with a close-up image of the tissue adjacent to a surgical instrument (for example, within the second tissue portion 2154b) and a wide-field image of the tissue around the immediate vicinity of the medical instrument (for example, the first, proximal tissue portion 2154a). In some non-limiting examples, the image presented by the second, narrower field of view 2147b of the second light sensor module 2144b can be a surface image of the surgical site. In some additional examples, the image shown in the first, wide field of view 2147a of the first light sensor module 2144a may include a display based on a hyperspectral analysis of the tissue viewed in the wide field of view.
[0414] [0414] Figures 47A to 47C illustrate an example of the use of an imaging system that incorporates the features disclosed in Figure 46D. Figure 47A schematically illustrates a view 2170 of the distal end of an elongated camera probe, depicting the light sensor arrays 2172a and 2172b of the two light sensor modules 2174a and 2174b. The first light sensor module 2174a may include a wide-angle lens, and the second light sensor module 2174b may include a narrow-angle lens. In some aspects, the second light sensor module 2174b may have a narrow-aperture lens. In other aspects, the second light sensor module 2174b may have a magnifying lens. The tissue can be illuminated by light sources disposed at the distal end of the elongated camera probe. The light sensor arrays 2172' (the light sensor array 2172a or 2172b, or both 2172a and 2172b) can receive the light reflected by the tissue under this illumination. The tissue can be illuminated by light from a red laser source, a green laser source, a blue laser source, an infrared laser source and/or an ultraviolet laser source. In some aspects, the light sensor arrays 2172' can sequentially receive red laser light 2175a, green laser light 2175b, blue laser light 2175c, infrared laser light 2175d and ultraviolet laser light 2175e. The tissue can be illuminated by any combination of such laser sources simultaneously, as shown in Figures 23E and 23F. Alternatively, the illumination light may be cycled between any combination of such laser sources, as shown, for example, in Figures 23D, 43A and 43B.
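The sequential illumination cycle described above can be sketched minimally. The center wavelengths and the `capture_frame` callback below are illustrative assumptions; the text names the colors of the laser sources but not these exact values:

```python
from itertools import cycle

# Illustrative center wavelengths in nm; the text specifies the source colors
# but not these exact values, so treat them as assumptions.
LASER_SOURCES_NM = {
    "red": 635, "green": 532, "blue": 450, "infrared": 850, "ultraviolet": 375,
}

def capture_illumination_cycle(sources, capture_frame, n_frames):
    """Fire each laser source in turn and collect the frames it produced.

    `capture_frame(source)` is a stand-in for triggering the light sensor
    while only `source` illuminates the tissue."""
    frames = {s: [] for s in sources}
    for _, source in zip(range(n_frames), cycle(sources)):
        frames[source].append(capture_frame(source))
    return frames
```

Grouping frames by source keeps each wavelength's reflectance data separate for later per-wavelength analysis.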
[0415] [0415] Figure 47B schematically depicts a portion of lung tissue 2180 that may contain a tumor 2182. The tumor 2182 may be in communication with blood vessels, including one or more veins 2184 and/or arteries 2186. In some surgical procedures, the blood vessels (veins 2184 and arteries 2186) associated with the tumor 2182 may require resection and/or cauterization before the tumor is removed.
[0416] [0416] Figure 47C illustrates the use of a dual imaging system as disclosed above with respect to Figure 47A. The first light sensor module 2174a can capture a wide-angle image of the tissue surrounding a blood vessel 2187 to be cut with a surgical knife 2190. The wide-angle image can allow the surgeon to verify the blood vessel 2187 to be transected. In addition, the second light sensor module 2174b can capture a narrow-angle image of the specific blood vessel 2187 being manipulated. The narrow-angle image can show the surgeon the progress of the manipulation of the blood vessel 2187. In this way, the surgeon receives an image of the vascular tissue being manipulated as well as of its surroundings, to ensure that the correct blood vessel is being manipulated.
[0417] [0417] Figures 48A and 48B depict another example of the use of a dual imaging system. Figure 48A shows a primary surgical display providing an image of a section of a surgical site. The primary surgical display can show a wide-view image 2800 of a section of intestine 2802 along with its vasculature 2804. The wide-view image 2800 can include a portion of the surgical field 2809 that can be displayed separately as a magnified view 2810 on a secondary surgical display (Figure 48B).
[0418] [0418] Figure 48B shows a secondary surgical display that can display only a narrow, magnified view image 2810 of a portion of the surgical field 2809. The narrow magnified view image 2810 can show a close view of the vascular tree 2814 so that the surgeon can focus on dissecting only the blood vessel of interest 2815. To perform the resection of the blood vessel of interest 2815, a surgeon can use an intelligent RF cauterization device 2816. It can be understood that any image obtained through the visualization system can include not only images of the tissue at the surgical site, but also images of the surgical instruments inserted therein. In some aspects, such a surgical display (the primary display in Figure 48A or the secondary display in Figure 48B) may also include symbols 2817 related to the functions or settings of any surgical device used during the surgical procedure. For example, the symbols 2817 can include a power setting of the intelligent RF cauterization device 2816. In some aspects, such intelligent surgical devices can transmit data related to their operational parameters to the visualization system so that the visualization system can incorporate them into the data to be transmitted to one or more display devices.
[0419] [0419] Figures 49A to 49C illustrate an example of a sequence of surgical steps for the removal of an intestinal/colon tumor that may benefit from the use of multiple image analyses of the surgical site. Figure 49A shows a portion of the surgical site, including the intestines 2932 and the branched vasculature 2934 that supplies blood and nutrients to the intestines 2932. The intestines 2932 may have a tumor 2936 surrounded by a tumor margin 2937. The first light sensor module of a visualization system can have a wide field of view 2930 and can provide imaging data from the wide field of view 2930 to a display system. The second light sensor module of the visualization system can have a narrow or standard field of view 2940 and can provide imaging data from the narrow field of view 2940 to the display system. In some aspects, the wide-field image and the narrow-field image can be displayed by the same display device. In other aspects, the wide-field image and the narrow-field image can be displayed by separate devices.
[0420] [0420] During the surgical procedure, it may be important to remove not only the tumor 2936 but also the margin 2937 surrounding it, to ensure complete removal of the tumor. The wide-angle field of view 2930 can be used to image both the vasculature 2934 and the section of the intestine 2932 surrounding the tumor 2936 and the margin 2937. As noted above, the vasculature that feeds the tumor 2936 and the margin 2937 must be removed, but the vasculature that feeds the surrounding intestinal tissue must be preserved to provide oxygen and nutrients to the surrounding tissue. Transection of the vasculature that feeds the surrounding colon tissue would deprive that tissue of oxygen and nutrients, leading to necrosis. In some examples, Doppler laser imaging of the tissue viewed in the wide-angle field 2930 can be analyzed to provide a speckle contrast analysis 2933, indicating blood flow within the intestinal tissue.
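Laser speckle contrast, mentioned above, is conventionally computed as the ratio of the standard deviation to the mean intensity over a small sliding window; blood flowing during the exposure blurs the speckle pattern and lowers the contrast. A minimal sketch follows, with the window size and implementation details as assumptions:

```python
import numpy as np

def speckle_contrast_map(intensity, window=7):
    """Local speckle contrast K = sigma / mean over a sliding window.

    Moving blood blurs the speckle pattern during the exposure, lowering K,
    so low-K regions indicate perfused tissue and high-K regions indicate
    little or no flow."""
    pad = window // 2
    padded = np.pad(intensity.astype(float), pad, mode="reflect")
    h, w = intensity.shape
    k = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            patch = padded[i:i + window, j:j + window]
            m = patch.mean()
            k[i, j] = patch.std() / m if m > 0 else 0.0
    return k
```

Comparing such a map before and after clamping a vessel is one way the non-perfused intestinal section could be delineated.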
[0421] [0421] Figure 49B illustrates a step during the surgical procedure. The surgeon may not be sure which part of the vascular tree supplies blood to the tumor 2936. The surgeon can test a blood vessel 2944 to determine whether it feeds the tumor 2936 or healthy tissue. The surgeon can clamp the blood vessel 2944 with a grasping device 2812 and determine, by speckle contrast analysis, the section of intestinal tissue 2943 that is no longer perfused. The narrow field of view 2940 displayed on an imaging device can assist the surgeon with the close view and detailed work required to visualize the single blood vessel 2944 being tested. When the suspect blood vessel 2944 is clamped, it is determined, based on the speckle contrast analysis of the Doppler imaging, that a portion of the intestinal tissue 2943 lacks perfusion. As shown in Figure 49B, the suspect blood vessel 2944 does not supply blood to the tumor 2936 or to the tumor margin 2937, and is therefore recognized as a blood vessel that must be spared during the surgical procedure.
[0422] [0422] Figure 49C depicts a next stage of the surgical procedure. At this stage, a supply blood vessel 2984 has been identified as supplying blood to the margin 2937 of the tumor. When this supply blood vessel 2984 is transected, blood is no longer supplied to a section of the intestine 2987 that may include at least a portion of the margin 2937 of the tumor 2936. In some aspects, the lack of perfusion of the intestine section 2987 can be determined by means of a speckle contrast analysis based on a Doppler analysis of blood flow to the intestine. The non-perfused section of the intestine 2987 can then be isolated by means of a seal 2985 applied to the intestine. In this way, only the blood vessels that perfuse the tissue indicated for surgical removal are identified and sealed, thereby sparing healthy tissue from unintended surgical consequences.
[0423] [0423] In some additional aspects, a surgical visualization system may allow for imaging analysis of the surgical site.
[0424] [0424] In some aspects, the surgical site can be inspected for the effectiveness of a surgical manipulation of a tissue. Non-limiting examples of such inspections may include the inspection of surgical staples or welds used to seal tissue at a surgical site. Coherent cone beam tomography using one or more of the light sources can be used for such methods.
[0425] [0425] In some additional aspects, an image of a surgical site may have landmarks indicated in the image. In some examples, the landmarks can be determined using image analysis techniques. In some alternative examples, the landmarks can be indicated by manual annotation of the image by the surgeon.
[0426] [0426] In some additional aspects, off-the-shelf, non-intelligent visualization methods can be imported for use in image fusion techniques in the central controller.
[0427] [0427] In additional aspects, instruments that are not integrated into the central controller system can be identified and tracked during their use within the surgical site. In this regard, the computational and/or storage components of the central controller or any of its components (including, for example, the cloud-based system) may include a database of images of EES and competitor surgical instruments that are identifiable from one or more images captured by any image capture system, or through visual analysis of such alternative instruments. The imaging analysis of such devices can additionally allow identification of when an instrument is replaced by a different instrument to do the same or similar work. Identifying the replacement of an instrument during a surgical procedure can provide information related to an instrument not performing its job, or information about a device failure.
Situational recognition
[0428] [0428] Situational recognition is the ability of some aspects of a surgical system to determine or infer information related to a surgical procedure from data received from databases and/or instruments. The information can include the type of procedure being performed, the type of tissue being operated on, or the body cavity that is the object of the procedure. With contextual information related to the surgical procedure, the surgical system can, for example, improve the manner in which it controls the modular devices (for example, a robotic arm and/or robotic surgical instrument) that are connected to it and provide contextualized information or suggestions to the surgeon during the course of the surgical procedure.
[0429] [0429] Figure 50 shows a timeline 5200 depicting the situational recognition of a central controller, such as the central surgical controller 106 or 206, for example. The timeline 5200 depicts an illustrative surgical procedure and the contextual information that the central surgical controller 106, 206 can derive from the data received from the data sources at each step of the surgical procedure. The timeline 5200 depicts the typical steps that would be taken by nurses, surgeons and other medical personnel during the course of a pulmonary segmentectomy procedure, starting with the setup of the operating room and ending with the transfer of the patient to a postoperative recovery room.
[0430] [0430] The situationally aware central surgical controller 106, 206 receives data from the data sources throughout the course of the surgical procedure, including data generated each time medical personnel use a modular device that is paired with the central surgical controller 106, 206. The central surgical controller 106, 206 can receive this data from the paired modular devices and other data sources and continually derive inferences (i.e., contextual information) about the ongoing procedure as new data is received, such as which step of the procedure is being performed at a given time. The situational recognition system of the central surgical controller 106, 206 is capable of, for example, recording data related to the procedure to generate reports, verifying the steps being taken by the medical personnel, providing data or warnings (for example, through a display) that may be relevant to the specific step of the procedure, adjusting the modular devices based on the context (for example, activating monitors, adjusting the field of view (FOV) of the medical imaging device, or changing the energy level of an ultrasonic surgical instrument or RF electrosurgical instrument), and taking any other action described above.
[0431] [0431] In the first step 5202 of this illustrative procedure, the members of the hospital team retrieve the patient's electronic medical record (EMR) from the hospital's EMR database. Based on select patient data in the EMR, the central surgical controller 106, 206 determines that the procedure to be performed is a thoracic procedure.
[0432] [0432] In the second step 5204, the team members scan the incoming medical supplies for the procedure. The central surgical controller 106, 206 cross-references the scanned supplies with a list of supplies that are used in various types of procedures and confirms that the mix of supplies corresponds to a thoracic procedure. In addition, the central surgical controller 106, 206 is also able to determine that the procedure is not a wedge procedure (because the incoming supplies lack certain supplies that are necessary for a thoracic wedge procedure, or otherwise do not correspond to a thoracic wedge procedure).
[0433] [0433] In the third step 5206, the medical personnel scan the patient's wristband with a scanner that is communicably connected to the central surgical controller 106, 206. The central surgical controller 106, 206 can then confirm the patient's identity based on the scanned data.
[0434] [0434] In the fourth step 5208, the medical staff turns on the auxiliary equipment. The auxiliary equipment being used can vary according to the type of surgical procedure and the techniques to be used by the surgeon, but in this illustrative case it includes a smoke evacuator, an insufflator and a medical imaging device. When activated, the auxiliary equipment that consists of modular devices can automatically pair with the central surgical controller 106, 206 that is located within a particular vicinity of the modular devices, as part of their initialization process. The central surgical controller 106, 206 can then derive contextual information about the surgical procedure by detecting the types of modular devices that pair with it during this preoperative or initialization phase. In this particular example, the central surgical controller 106, 206 determines that the surgical procedure is a VATS (video-assisted thoracic surgery) procedure based on this specific combination of paired modular devices. Based on the combination of the data from the patient's electronic medical record (EMR), the list of medical supplies to be used in the procedure, and the type of modular devices that connect to the central controller, the central surgical controller 106, 206 can, in general, infer the specific procedure that the surgical team will perform. After the central surgical controller 106, 206 knows which specific procedure is being performed, the central surgical controller 106, 206 can then retrieve the steps of that procedure from a memory or from the cloud and then cross-reference the data that it subsequently receives from the connected data sources (for example, modular devices and patient monitoring devices) to infer which step of the surgical procedure the surgical team is performing.
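The chain of inferences described above can be caricatured as a small rule table. The sketch below is illustrative only; the device, supply, and procedure labels are hypothetical stand-ins, not identifiers used by the central surgical controller:

```python
def infer_procedure(emr_type, scanned_supplies, paired_devices):
    """Toy rule-based sketch of the inferences described above.

    emr_type: procedure category derived from the EMR (e.g. "thoracic").
    scanned_supplies: set of supply names scanned into the room.
    paired_devices: set of modular devices paired with the controller."""
    if emr_type != "thoracic":
        return "unknown"
    # Absence of wedge-specific supplies rules out a wedge procedure.
    wedge_ruled_out = "wedge kit" not in scanned_supplies
    # This particular combination of paired devices suggests a VATS procedure.
    vats_devices = {"smoke evacuator", "insufflator", "medical imaging device"}
    if vats_devices.issubset(paired_devices):
        return "VATS (non-wedge)" if wedge_ruled_out else "VATS"
    return "thoracic (undetermined)"
```

A real hub would refine such an inference continually as new data (ventilator, generator, stapler cartridge) arrives, rather than deciding once.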
[0435] [0435] In the fifth step 5210, the team members attach the electrocardiogram (ECG) electrodes and other patient monitoring devices to the patient. The ECG electrodes and other patient monitoring devices are able to pair with the central surgical controller 106, 206. As the central surgical controller 106, 206 begins to receive data from the patient monitoring devices, the central surgical controller 106, 206 thus confirms that the patient is in the operating room.
[0436] [0436] In the sixth step 5212, the medical personnel induce anesthesia in the patient. The central surgical controller 106, 206 can infer that the patient is under anesthesia based on data from the modular devices and/or patient monitoring devices, including ECG data, blood pressure data, ventilator data, or combinations thereof, for example. After the completion of the sixth step 5212, the preoperative portion of the lung segmentectomy procedure is complete and the operative portion begins.
[0437] [0437] In the seventh step 5214, the lung of the patient being operated on is collapsed (while ventilation is switched to the contralateral lung). The central surgical controller 106, 206 can infer from the ventilator data that the patient's lung has been collapsed, for example. The central surgical controller 106, 206 can infer that the operative portion of the procedure has started, since it can compare the detection of the collapse of the patient's lung to the expected steps of the procedure (which can be accessed or retrieved previously) and thereby determine that collapsing the patient's lung is the first operative step in this particular procedure.
[0438] [0438] In the eighth step 5216, the medical imaging device (for example, an endoscope) is inserted and video from the medical imaging device is initiated. The central surgical controller 106, 206 receives data from the medical imaging device (i.e., video or image data) through its connection to the medical imaging device. Upon receipt of the data from the medical imaging device, the central surgical controller 106, 206 can determine that the laparoscopic portion of the surgical procedure has started. In addition, the central surgical controller 106, 206 can determine that the specific procedure being performed is a segmentectomy, rather than a lobectomy (note that a wedge procedure has already been ruled out by the central surgical controller 106, 206 based on the data received in the second step 5204 of the procedure). The data from the medical imaging device 124 (Figure 2) can be used to determine contextual information about the type of procedure being performed in a number of different ways, including by determining the angle at which the medical imaging device is oriented with respect to the visualization of the patient's anatomy, monitoring the number of medical imaging devices being used (i.e., that are activated and paired with the central surgical controller 106, 206), and monitoring the types of visualization devices used.
[0439] [0439] In the ninth step 5218 of the procedure, the surgical team begins the dissection step. The central surgical controller 106, 206 can infer that the surgeon is in the process of dissecting to mobilize the patient's lung because it receives data from the RF or ultrasonic generator indicating that an energy instrument is being fired. The central surgical controller 106, 206 can cross-reference the received data with the retrieved steps of the surgical procedure to determine that an energy instrument being fired at this point in the procedure (that is, after the completion of the previously discussed steps of the procedure) corresponds to the dissection step. In certain cases, the energy instrument can be an energy tool mounted on a robotic arm of a robotic surgical system.
[0440] [0440] In the tenth step 5220 of the procedure, the surgical team proceeds to the ligation step. The central surgical controller 106, 206 can infer that the surgeon is ligating the arteries and veins because it receives data from the surgical stapling and cutting instrument indicating that the instrument is being fired. Similarly to the previous step, the central surgical controller 106, 206 can derive this inference by cross-referencing the data received from the surgical stapling and cutting instrument with the retrieved steps of the procedure. In certain cases, the surgical instrument can be a surgical tool mounted on a robotic arm of a robotic surgical system.
[0441] [0441] In the eleventh step 5222, the segmentectomy portion of the procedure is performed. The central surgical controller 106, 206 can infer that the surgeon is transecting the parenchyma based on data from the surgical stapling and cutting instrument, including data from its cartridge. The cartridge data can correspond to the size or type of staple being fired by the instrument, for example. As different types of staples are used for different types of tissue, the cartridge data can thus indicate the type of tissue being stapled and/or transected. In this case, the type of staple being fired is used for parenchyma (or other similar tissue types), which allows the central surgical controller 106, 206 to infer that the segmentectomy portion of the procedure is being performed.
[0442] [0442] In the twelfth step 5224, the node dissection step is then performed. The central surgical controller 106, 206 can infer that the surgical team is dissecting the node and performing a leak test based on data received from the generator indicating that an ultrasonic or RF instrument is being fired. For this particular procedure, an RF or ultrasonic instrument being used after the parenchyma has been transected corresponds to the node dissection step, which allows the central surgical controller 106, 206 to make this inference. It should be noted that surgeons regularly switch back and forth between surgical stapling/cutting instruments and surgical energy instruments (that is, RF or ultrasonic) depending on the particular step of the procedure, because different instruments are better adapted to particular tasks. Therefore, the particular sequence in which the stapling/cutting instruments and the surgical energy instruments are used can indicate which step of the procedure the surgeon is performing. In addition, in certain cases, robotic tools can be used for one or more steps of a surgical procedure, and/or hand-held surgical instruments can be used for one or more steps of the surgical procedure. The surgeon can switch between the robotic tools and the hand-held surgical instruments and/or can use the devices simultaneously, for example. After the completion of the twelfth step 5224, the incisions are closed and the postoperative portion of the procedure begins.
[0443] [0443] In the thirteenth step 5226, the patient's anesthesia is reversed. The central surgical controller 106, 206 can infer that the patient is emerging from anesthesia based on ventilator data (i.e., the patient's respiratory rate begins to increase), for example.
[0444] [0444] Finally, in the fourteenth step 5228, the medical personnel remove the various patient monitoring devices from the patient. The central surgical controller 106, 206 can thus infer that the patient is being transferred to a recovery room when the central controller loses the ECG, blood pressure and other data from the patient monitoring devices. As can be seen from the description of this illustrative procedure, the central surgical controller 106, 206 can determine or infer when each step of a given surgical procedure is taking place according to the data received from the various data sources that are communicably coupled to the central surgical controller 106, 206.
[0445] [0445] Situational recognition is further described in US Provisional Patent Application Serial No. 62/611,341, entitled INTERACTIVE SURGICAL PLATFORM, filed on December 28, 2017, the disclosure of which is incorporated herein by reference in its entirety. In certain cases, the operation of a robotic surgical system, including the various robotic surgical systems disclosed herein, for example, can be controlled by the central controller 106, 206 based on its situational recognition and/or feedback from its components and/or based on information from the cloud 102.
[0446] [0446] Various aspects of the subject matter described herein are set forth in the following numbered examples.
[0447] [0447] Example 1. A surgical image capture system comprising: a plurality of light sources, wherein each light source is configured to emit light that has a specified central wavelength; a light sensor configured to receive a portion of the light reflected from a tissue sample when illuminated by one or more of the plurality of light sources; and a computing system, wherein the computing system is configured to: receive data from the light sensor when the tissue sample is illuminated by each of the plurality of light sources; calculate structural data related to a characteristic of a structure of the tissue sample based on the data received by the light sensor when the tissue sample is illuminated by each light source; and transmit the structural data related to the characteristic of the structure to be received by an intelligent surgical device, wherein the characteristic of the structure is a surface characteristic or a composition of the structure.
[0448] [0448] Example 2. The surgical image capture system of Example 1, wherein the plurality of light sources comprises at least one of a red light source, a green light source and a blue light source.
[0449] [0449] Example 3. The surgical image capture system of any of Examples 1 and 2, wherein the plurality of light sources comprises at least one of an infrared light source and an ultraviolet light source.
[0450] [0450] Example 4. The surgical image capture system of any of Examples 1 to 3, wherein the computing system configured to calculate the structural data related to a characteristic of a structure within the tissue comprises a computing system configured to calculate structural data related to a composition of a structure within the tissue.
[0451] [0451] Example 5. The surgical image capture system of any of Examples 1 to 4, wherein the computing system configured to calculate the structural data related to a characteristic of a structure within the tissue comprises a computing system configured to calculate structural data related to a surface roughness of a structure within the tissue.
[0452] [0452] Example 6. A surgical image capture system comprising: a processor; and a memory coupled to the processor, wherein the memory stores instructions executable by the processor to: control the operation of a plurality of light sources illuminating a tissue sample, wherein each light source is configured to emit light with a specified central wavelength; receive light sensor data when the tissue sample is illuminated by each of the plurality of light sources; calculate structural data related to a characteristic of a structure of the tissue sample based on the data received by the light sensor when the tissue sample is illuminated by each light source; and transmit the structural data related to the characteristic of the structure to be received by an intelligent surgical device, wherein the characteristic of the structure is a surface characteristic or a composition of the structure.
[0453] [0453] Example 7. The surgical image capture system of Example 6, wherein the instructions executable by the processor to control the operation of a plurality of light sources comprise one or more instructions to illuminate the tissue sample sequentially by each one of the plurality of light sources.
[0454] [0454] Example 8. The surgical image capture system of any of Examples 6 and 7, wherein the instructions executable by the processor to calculate the structural data related to a characteristic of a structure within the tissue sample based on the data received by the light sensor comprise one or more instructions to calculate structural data related to a characteristic of a structure within the tissue sample based on a phase shift of the light reflected from the tissue sample.
[0455] [0455] Example 9. The surgical image capture system of any of Examples 6 to 8, wherein the composition of the structure comprises a relative composition of collagen and elastin in a tissue.
[0456] [0456] Example 10. The surgical image capture system of any of Examples 6 to 9, wherein the composition of the structure comprises an amount of hydration of a tissue.
[0457] [0457] Example 11. A surgical image capture system comprising: a control circuit configured to: control the operation of a plurality of light sources illuminating a tissue sample, wherein each light source is configured to emit light with a specified central wavelength; receive light sensor data when the tissue sample is illuminated by each of the plurality of light sources; calculate structural data related to a characteristic of a structure of the tissue sample based on the data received by the light sensor when the tissue sample is illuminated by each light source; and transmit the structural data related to the characteristic of the structure to be received by an intelligent surgical device, wherein the characteristic of the structure is a surface characteristic or a composition of the structure.
[0458] [0458] Example 12. The surgical image capture system of Example 11, wherein the control circuit is configured to transmit the structural data related to the characteristic of the structure to be received by an intelligent surgical device, wherein the intelligent surgical device is an intelligent surgical stapler.
[0459] [0459] Example 13. The surgical image capture system of Example 12, wherein the control circuit is additionally configured to transmit data related to an anvil pressure based on the characteristic of the structure to be received by the intelligent surgical stapler.
[0460] [0460] Example 14. The surgical image capture system of any of Examples 11 to 13, wherein the control circuit is configured to transmit the structural data related to the characteristic of the structure to be received by an intelligent surgical device, wherein the intelligent surgical device is an intelligent RF sealing surgical device.
[0461] [0461] Example 15. The surgical image capture system of Example 14, wherein the control circuit is additionally configured to transmit data related to an amount of RF power based on the characteristic of the structure to be received by the intelligent RF sealing device.
[0462] [0462] Example 16. The surgical image capture system of any of Examples 11 to 15, wherein the control circuit is configured to transmit the structural data related to the characteristic of the structure to be received by an intelligent surgical device, wherein the intelligent surgical device is an intelligent ultrasonic cutting device.
[0463] [0463] Example 17. The surgical image capture system of Example 16, wherein the control circuit is additionally configured to transmit data related to an amount of power to an ultrasonic transducer or an activation frequency of the ultrasonic transducer based on the characteristic of the structure to be received by the ultrasonic cutting device.
[0464] [0464] Example 18. A non-transitory computer-readable medium that stores computer-readable instructions that, when executed, cause a machine to: control the operation of a plurality of light sources illuminating a tissue sample, wherein each light source is configured to emit light with a specified central wavelength; receive light sensor data when the tissue sample is illuminated by each of the plurality of light sources; calculate structural data related to a characteristic of a structure of the tissue sample based on the data received by the light sensor when the tissue sample is illuminated by each light source; and transmit the structural data related to the characteristic of the structure to be received by an intelligent surgical device, wherein the characteristic of the structure is a surface characteristic or a composition of the structure.
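As one illustration of Example 8's phase-shift calculation, under a simple interferometric model (an assumption not spelled out in the examples), a measured phase shift of the reflected light maps to a surface height difference, which could feed a surface-roughness characteristic:

```python
import math

def height_difference_nm(center_wavelength_nm, phase_shift_rad):
    """In reflection, the light travels the extra height twice, so a phase
    shift dphi at center wavelength lam maps to a surface height difference
    dh = lam * dphi / (4 * pi).

    This is a sketch of one way phase-shift data could yield structural
    data; the model is an assumption, not a method stated in the examples."""
    return center_wavelength_nm * phase_shift_rad / (4 * math.pi)
```

For a green source near 532 nm, a phase shift of pi radians would correspond to a height difference of 133 nm under this model.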
[0465] [0465] Although several forms have been illustrated and described, it is not the applicant's intention to restrict or limit the scope of the appended claims to such detail. Numerous modifications, variations, alterations, substitutions, combinations and equivalents of these forms can be implemented and will occur to those skilled in the art without departing from the scope of the present disclosure. In addition, the structure of each element associated with a form can alternatively be described as a means for providing the function performed by the element. Also, where materials are disclosed for certain components, other materials can be used. It should be understood, therefore, that the foregoing description and the appended claims are intended to cover all such modifications, variations, alterations, substitutions, combinations and equivalents.
[0466] The preceding detailed description presented various forms of devices and/or processes through the use of block diagrams, flowcharts and/or examples. Although these block diagrams, flowcharts and/or examples contain one or more functions and/or operations, it will be understood by those skilled in the art that each function and/or operation within them can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware or virtually any combination thereof. Those skilled in the art will recognize that some aspects of the forms disclosed herein can be equivalently implemented, in whole or in part, in integrated circuits, as one or more computer programs running on one or more computers (for example, as one or more programs running on one or more computer systems), as one or more programs running on one or more processors (for example, as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and firmware would be within the skill of those in the art in light of this description. In addition, those skilled in the art will understand that the mechanisms of the subject matter described herein can be distributed as one or more program products in a variety of forms, and that an illustrative form of the subject matter described herein is applicable regardless of the specific type of signal-carrying transmission medium used to actually carry out the distribution.
[0467] The instructions used to program logic to execute the various disclosed aspects can be stored in a memory in the system, such as dynamic random access memory (DRAM), cache, flash memory or other storage. In addition, the instructions can be distributed over a network or through other computer-readable media. Thus, a machine-readable medium can include any mechanism to store or transmit information in a form readable by a machine (for example, a computer), including, but not limited to, floppy disks, optical discs, compact disc read-only memories (CD-ROMs), magneto-optical discs, read-only memory (ROM), random access memory (RAM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), magnetic or optical cards, flash memory, or tangible machine-readable storage media used to transmit information over the Internet via electrical, optical, acoustic or other forms of propagated signals (for example, carrier waves, infrared signals, digital signals, etc.). Consequently, non-transitory computer-readable media include any type of machine-readable media suitable for storing or transmitting electronic instructions or information in a form readable by a machine (for example, a computer).
[0468] As used in any aspect of the present invention, the term "control circuit" can refer to, for example, a set of wired circuits, programmable circuits (for example, a computer processor comprising one or more individual instruction-processing cores, a processing unit, a processor, a microcontroller, a microcontroller unit, a controller, a digital signal processor (DSP), a programmable logic device (PLD), a programmable logic array (PLA), or a field-programmable gate array (FPGA)), state machine circuits, firmware that stores instructions executed by the programmable circuits, and any combination thereof. The control circuit can, collectively or individually, be incorporated as an electrical circuit that is part of a larger system, for example, an integrated circuit (IC), an application-specific integrated circuit (ASIC), a system on chip (SoC), desktop computers, laptop computers, tablet computers, servers, smartphones, etc. Consequently, as used in the present invention, "control circuit" includes, but is not limited to, electrical circuits that have at least one discrete electrical circuit, electrical circuits that have at least one integrated circuit, electrical circuits that have at least one application-specific integrated circuit, electrical circuits that form a general-purpose computing device configured by a computer program (for example, a general-purpose computer configured by a computer program that at least partially executes the processes and/or devices described herein, or a microprocessor configured by a computer program that at least partially executes the processes and/or devices described herein), electrical circuits that form a memory device (for example, forms of random access memory), and/or electrical circuits that form a communications device (for example, a modem, a communications switch, or optical-electrical equipment). Those skilled in the art will recognize that the subject matter described herein can be implemented in an analog or digital fashion, or in some combination thereof.
[0469] As used in any aspect of the present invention, the term "logic" can refer to an application, software, firmware and/or circuit configured to perform any of the aforementioned operations. The software can be incorporated as a software package, code, instructions, instruction sets and/or data recorded on non-transitory computer-readable storage media. The firmware can be incorporated as code, instructions or instruction sets and/or data that are hard-coded (for example, non-volatile) in memory devices.
[0470] As used in any aspect of the present invention, the terms "component", "system", "module" and the like can refer to a computer-related entity, whether hardware, a combination of hardware and software, software, or software in execution.
[0471] As used in any aspect of the present invention, an "algorithm" refers to a self-consistent sequence of steps leading to a desired result, where a "step" refers to the manipulation of physical quantities and/or logical states that can, although they do not necessarily need to, take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared and otherwise manipulated. It is common usage to call these signals bits, values, elements, symbols, characters, terms, numbers or the like. These and similar terms can be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities and/or states.
[0472] A network can include a packet-switched network. The communication devices can be capable of communicating with each other using a selected packet-switched network communications protocol. An exemplary communications protocol can include an Ethernet communications protocol capable of permitting communication using a Transmission Control Protocol/Internet Protocol (TCP/IP). The Ethernet protocol may comply or be compatible with the Ethernet standard published by the Institute of Electrical and Electronics Engineers (IEEE), titled "IEEE 802.3 Standard".
[0473] Unless specifically stated otherwise, as is evident from the preceding description, it is understood that, throughout the preceding description, discussions using terms such as "processing", "computing", "calculating", "determining", "displaying" or the like refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities in the computer system's registers and memories into other data similarly represented as physical quantities in the computer system's memories or registers, or in other such devices for storing, transmitting or displaying information.
[0474] One or more components in the present invention may be referred to as "configured to", "configurable to", "operable/operational to", "adapted/adaptable to", "capable of", "conformable/conformed to", etc. Those skilled in the art will recognize that "configured to" can, in general, cover components in an active state and/or components in an inactive state and/or components in a standby state, except where the context dictates otherwise.
[0475] [0475] The terms "proximal" and "distal" are used in the present invention with reference to a physician who handles the handle portion of the surgical instrument. The term "proximal" refers to the portion closest to the doctor, and the term "distal" refers to the portion located opposite the doctor. It will also be understood that, for the sake of convenience and clarity, spatial terms such as "vertical", "horizontal", "up" and "down" can be used in the present invention with respect to the drawings. However, surgical instruments can be used in many orientations and positions, and these terms are not intended to be limiting and / or absolute.
[0476] Those skilled in the art will recognize that, in general, the terms used herein, and especially in the appended claims (for example, the bodies of the appended claims), are generally intended as "open" terms (for example, the term "including" should be interpreted as "including, but not limited to", the term "having" should be interpreted as "having at least", the term "includes" should be interpreted as "includes, but is not limited to", etc.).
[0477] Furthermore, even if a specific number of an introduced claim recitation is explicitly mentioned, those skilled in the art will recognize that such a recitation should typically be interpreted to mean at least the number mentioned (for example, the bare mention of "two recitations", without other modifiers, typically means at least two recitations, or two or more recitations). In addition, in cases where a convention analogous to "at least one of A, B and C, etc." is used, in general such a construction is intended in the sense in which a person skilled in the art would understand the convention (for example, "a system that has at least one of A, B and C" would include, but not be limited to, systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.).
[0478] With respect to the appended claims, those skilled in the art will understand that the operations mentioned therein can, in general, be performed in any order. In addition, although several operational flow diagrams are presented in one or more sequences, it should be understood that the various operations can be performed in orders other than those shown, or can be performed simultaneously. Examples of such alternative orderings can include overlapping, interleaved, interrupted, reordered, incremental, preparatory, supplementary, simultaneous, reverse or other variant orderings, unless the context dictates otherwise. In addition, terms such as "responsive to", "related to" or other past-participle adjectives are not, in general, intended to exclude such variants, unless the context dictates otherwise.
[0479] It is worth noting that any reference to "one aspect", "an aspect", "one exemplification", "an exemplification" and the like means that a particular feature, structure or characteristic described in connection with the aspect is included in at least one aspect. Thus, the use of expressions such as "in one aspect", "in an aspect", "in one exemplification" or "in an exemplification" in various places throughout this specification does not necessarily refer to the same aspect. In addition, the particular features, structures or characteristics can be combined in any appropriate way in one or more aspects.
[0480] Any patent application, patent, non-patent publication or other disclosure material mentioned in this specification and/or mentioned in any application data sheet is hereby incorporated by reference, to the extent that the incorporated materials are not inconsistent with the present description. Accordingly, and to the extent necessary, the description as explicitly presented herein supersedes any conflicting material incorporated by reference. Any material, or portion thereof, that is incorporated herein by reference but conflicts with the definitions, statements or other disclosure materials contained herein will be incorporated only to the extent that no conflict arises between the incorporated material and the existing disclosure material.
[0481] In summary, numerous benefits resulting from the use of the concepts described in this document have been described. The foregoing description of one or more modalities has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Modifications or variations are possible in light of the above teachings. The one or more modalities were chosen and described in order to illustrate principles and practical application and thereby enable those skilled in the art to use the various modalities, with various modifications, as suited to the particular use contemplated. It is intended that the appended claims define the overall scope.
Claims (18)
1. Surgical image capture system, characterized by comprising: a plurality of light sources, in which each light source is configured to emit light that has a specified central wavelength; a light sensor configured to receive a portion of the light reflected from a tissue sample when illuminated by one or more of the plurality of light sources; and a computing system, in which the computing system is configured to: receive data from the light sensor when the tissue sample is illuminated by each of the plurality of light sources; calculate structural data related to a characteristic of a structure of the tissue sample based on the data received by the light sensor when the tissue sample is illuminated by each light source; and transmit the structural data related to the structure characteristic to be received by an intelligent surgical device, in which the structure characteristic is a surface characteristic or a composition of the structure.
2. Surgical image capture system according to claim 1, characterized in that the plurality of light sources comprises at least one of a red light source, a green light source and a blue light source.
3. Surgical image capture system according to claim 1, characterized in that the plurality of light sources comprises at least one of an infrared light source and an ultraviolet light source.
4. Surgical image capture system according to claim 1, characterized in that the computing system configured to calculate the structural data related to a characteristic of a structure within the tissue comprises a computing system configured to calculate structural data related to a composition of a structure within the tissue.
5. Surgical image capture system according to claim 1, characterized in that the computing system configured to calculate the structural data related to a characteristic of a structure within the tissue comprises a computing system configured to calculate structural data related to a surface roughness of a structure within the tissue.
6. Surgical image capture system, characterized by comprising: a processor; and a memory coupled to the processor, in which the memory stores instructions executable by the processor to: control the operation of a plurality of light sources that illuminate a tissue sample, in which each light source is configured to emit light that has a specified central wavelength; receive light sensor data when the tissue sample is illuminated by each of the plurality of light sources; calculate structural data related to a characteristic of a structure of the tissue sample based on the data received by the light sensor when the tissue sample is illuminated by each light source; and transmit the structural data related to the structure characteristic to be received by an intelligent surgical device, in which the structure characteristic is a surface characteristic or a composition of the structure.
7. Surgical image capture system according to claim 6, characterized in that the instructions executable by the processor to control the operation of a plurality of light sources comprise one or more instructions to illuminate the tissue sample sequentially with each of the plurality of light sources.
8. Surgical image capture system according to claim 6, characterized in that the instructions executable by the processor to calculate the structural data related to a characteristic of a structure within the tissue sample based on the data received by the light sensor comprise one or more instructions to calculate structural data related to a characteristic of a structure within the tissue sample based on a phase shift of the light reflected from the tissue sample.
9. Surgical image capture system according to claim 6, characterized in that the structure composition comprises a relative composition of collagen and elastin in a tissue.
10. Surgical image capture system according to claim 6, characterized in that the structure composition comprises an amount of hydration of a tissue.
11. Surgical image capture system, characterized by comprising: a control circuit configured to: control the operation of a plurality of light sources that illuminate a tissue sample, in which each light source is configured to emit light that has a specified central wavelength; receive light sensor data when the tissue sample is illuminated by each of the plurality of light sources; calculate structural data related to a characteristic of a structure of the tissue sample based on the data received by the light sensor when the tissue sample is illuminated by each light source; and transmit the structural data related to the structure characteristic to be received by an intelligent surgical device, in which the structure characteristic is a surface characteristic or a composition of the structure.
12. Surgical image capture system according to claim 11, characterized in that the control circuit is configured to transmit the structural data related to the structure characteristic to be received by an intelligent surgical device, in which the intelligent surgical device is an intelligent surgical stapler.
13. Surgical image capture system according to claim 12, characterized in that the control circuit is additionally configured to transmit data related to an anvil pressure, based on the structure characteristic, to be received by the intelligent surgical stapler.
14. Surgical image capture system according to claim 11, characterized in that the control circuit is configured to transmit the structural data related to the structure characteristic to be received by an intelligent surgical device, in which the intelligent surgical device is an intelligent surgical RF sealing device.
15. Surgical image capture system according to claim 14, characterized in that the control circuit is additionally configured to transmit data related to an amount of RF power, based on the structure characteristic, to be received by the intelligent RF sealing device.
16. Surgical image capture system according to claim 11, characterized in that the control circuit is configured to transmit the structural data related to the structure characteristic to be received by an intelligent surgical device, in which the intelligent surgical device is an intelligent ultrasonic cutting device.
17. Surgical image capture system according to claim 16, characterized in that the control circuit is additionally configured to transmit data related to an amount of power supplied to an ultrasonic transducer or an activation frequency of the ultrasonic transducer, based on the structure characteristic, to be received by the ultrasonic cutting device.
18. Non-transitory computer-readable medium, characterized by storing computer-readable instructions that, when executed, cause a machine to: control the operation of a plurality of light sources that illuminate a tissue sample, in which each light source is configured to emit light that has a specified central wavelength; receive light sensor data when the tissue sample is illuminated by each of the plurality of light sources; calculate structural data related to a characteristic of a structure of the tissue sample based on the data received by the light sensor when the tissue sample is illuminated by each light source; and transmit the structural data related to the structure characteristic to be received by an intelligent surgical device, in which the structure characteristic is a surface characteristic or a composition of the structure.
Similar documents
Publication number | Publication date | Patent title
BR112020012974A2 | 2020-11-24 | Characterization of tissue irregularities through the use of monochromatic light refractivity
Patent family
Publication number | Publication date
US20190200905A1|2019-07-04|
EP3506297A1|2019-07-03|
WO2019130073A1|2019-07-04|
CN111526772A|2020-08-11|
JP2021509316A|2021-03-25|
US11160605B2|2017-12-28|2021-11-02|Cilag Gmbh International|Surgical evacuation sensing and motor control|
US10944728B2|2017-12-28|2021-03-09|Ethicon Llc|Interactive surgical systems with encrypted communication capabilities|
US11147607B2|2017-12-28|2021-10-19|Cilag Gmbh International|Bipolar combination device that automatically adjusts pressure based on energy modality|
US11076921B2|2017-12-28|2021-08-03|Cilag Gmbh International|Adaptive control program updates for surgical hubs|
US11202570B2|2017-12-28|2021-12-21|Cilag Gmbh International|Communication hub and storage device for storing parameters and status of a surgical device to be shared with cloud based analytics systems|
US10849697B2|2017-12-28|2020-12-01|Ethicon Llc|Cloud interface for coupled surgical devices|
US10892995B2|2017-12-28|2021-01-12|Ethicon Llc|Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs|
US11266468B2|2017-12-28|2022-03-08|Cilag Gmbh International|Cooperative utilization of data derived from secondary sources by intelligent surgical hubs|
US11100631B2|2017-12-28|2021-08-24|Cilag Gmbh International|Use of laser light and red-green-blue coloration to determine properties of back scattered light|
US11179208B2|2017-12-28|2021-11-23|Cilag Gmbh International|Cloud-based medical analytics for security and authentication trends and reactive measures|
US11045591B2|2017-12-28|2021-06-29|Cilag Gmbh International|Dual in-series large and small droplet filters|
US11013563B2|2017-12-28|2021-05-25|Ethicon Llc|Drive arrangements for robot-assisted surgical platforms|
US11109866B2|2017-12-28|2021-09-07|Cilag Gmbh International|Method for circular stapler control algorithm adjustment based on situational awareness|
US11069012B2|2017-12-28|2021-07-20|Cilag Gmbh International|Interactive surgical systems with condition handling of devices and data capabilities|
US10892899B2|2017-12-28|2021-01-12|Ethicon Llc|Self describing data packets generated at an issuing instrument|
US11234756B2|2017-12-28|2022-02-01|Cilag Gmbh International|Powered surgical tool with predefined adjustable control algorithm for controlling end effector parameter|
US10932872B2|2017-12-28|2021-03-02|Ethicon Llc|Cloud-based medical analytics for linking of local usage trends with the resource acquisition behaviors of larger data set|
US10943454B2|2017-12-28|2021-03-09|Ethicon Llc|Detection and escalation of security responses of surgical instruments to increasing severity threats|
US11213359B2|2017-12-28|2022-01-04|Cilag Gmbh International|Controllers for robot-assisted surgical platforms|
US11257589B2|2017-12-28|2022-02-22|Cilag Gmbh International|Real-time analysis of comprehensive cost of all instrumentation used in surgery utilizing data fluidity to track instruments through stocking and in-house processes|
US11259830B2|2018-03-08|2022-03-01|Cilag Gmbh International|Methods for controlling temperature in ultrasonic device|
US10973520B2|2018-03-28|2021-04-13|Ethicon Llc|Surgical staple cartridge with firing member driven camming assembly that has an onboard tissue cutting feature|
US11090047B2|2018-03-28|2021-08-17|Cilag Gmbh International|Surgical instrument comprising an adaptive control system|
US11096688B2|2018-03-28|2021-08-24|Cilag Gmbh International|Rotary driven firing members with different anvil and channel engagement features|
US11219453B2|2018-03-28|2022-01-11|Cilag Gmbh International|Surgical stapling devices with cartridge compatible closure and firing lockout arrangements|
US11213294B2|2018-03-28|2022-01-04|Cilag Gmbh International|Surgical instrument comprising co-operating lockout features|
US11207067B2|2018-03-28|2021-12-28|Cilag Gmbh International|Surgical stapling device with separate rotary driven closure and firing systems and firing member that engages both jaws while firing|
US11166716B2|2018-03-28|2021-11-09|Cilag Gmbh International|Stapling instrument comprising a deactivatable lockout|
US11197668B2|2018-03-28|2021-12-14|Cilag Gmbh International|Surgical stapling assembly comprising a lockout and an exterior access orifice to permit artificial unlocking of the lockout|
US20190298350A1|2018-03-28|2019-10-03|Ethicon Llc|Methods for controlling a powered surgical stapler that has separate rotary closure and firing systems|
US20200015902A1|2018-07-16|2020-01-16|Ethicon Llc|Force sensor through structured light deflection|
US11207065B2|2018-08-20|2021-12-28|Cilag Gmbh International|Method for fabricating surgical stapler anvils|
US11253256B2|2018-08-20|2022-02-22|Cilag Gmbh International|Articulatable motor powered surgical instruments with dedicated articulation motor arrangements|
US11259807B2|2019-02-19|2022-03-01|Cilag Gmbh International|Staple cartridges with cam surfaces configured to engage primary and secondary portions of a lockout of a surgical stapling device|
US11213361B2|2019-03-15|2022-01-04|Cilag Gmbh International|Robotic surgical systems with mechanisms for scaling surgical tool motion according to tissue proximity|
US11147553B2|2019-03-25|2021-10-19|Cilag Gmbh International|Firing drive arrangements for surgical systems|
US11172929B2|2019-03-25|2021-11-16|Cilag Gmbh International|Articulation drive arrangements for surgical systems|
US11147551B2|2019-03-25|2021-10-19|Cilag Gmbh International|Firing drive arrangements for surgical systems|
US11253254B2|2019-04-30|2022-02-22|Cilag Gmbh International|Shaft rotation actuator on a surgical instrument|
US11224497B2|2019-06-28|2022-01-18|Cilag Gmbh International|Surgical systems with multiple RFID tags|
US11259803B2|2019-06-28|2022-03-01|Cilag Gmbh International|Surgical stapling system having an information encryption protocol|
US11241235B2|2019-06-28|2022-02-08|Cilag Gmbh International|Method of using multiple RFID chips with a surgical assembly|
US11246678B2|2019-06-28|2022-02-15|Cilag Gmbh International|Surgical stapling system having a frangible RFID tag|
US11234698B2|2019-12-19|2022-02-01|Cilag Gmbh International|Stapling system comprising a clamp lockout and a firing lockout|
US20210196383A1|2019-12-30|2021-07-01|Ethicon Llc|Surgical systems correlating visualization data and powered surgical instrument data|
US20210196382A1|2019-12-30|2021-07-01|Ethicon Llc|Surgical system for overlaying surgical instrument data onto a virtual three dimensional construct of an organ|
US20210196386A1|2019-12-30|2021-07-01|Ethicon Llc|Analyzing surgical trends by a surgical system|
US20210196385A1|2019-12-30|2021-07-01|Ethicon Llc|Surgical systems for generating three dimensional constructs of anatomical organs and coupling identified anatomical structures thereto|
US20210196109A1|2019-12-30|2021-07-01|Ethicon Llc|Adaptive visualization by a surgical system|
US20210196423A1|2019-12-30|2021-07-01|Ethicon Llc|System and method for determining, adjusting, and managing resection margin about a subject tissue|
US20210196098A1|2019-12-30|2021-07-01|Ethicon Llc|Surgical system control based on multiple sensed parameters|
US20210196108A1|2019-12-30|2021-07-01|Ethicon Llc|Adaptive surgical system control according to surgical smoke cloud characteristics|
US11219501B2|2019-12-30|2022-01-11|Cilag Gmbh International|Visualization systems using structured light|
US20210196381A1|2019-12-30|2021-07-01|Ethicon Llc|Surgical systems for proposing and corroborating organ portion removals|
US20210196384A1|2019-12-30|2021-07-01|Ethicon Llc|Dynamic surgical visualization systems|
US20210199557A1|2019-12-30|2021-07-01|Ethicon Llc|Adaptive surgical system control according to surgical smoke particulate characteristics|
Legal status:
2021-12-07| B350| Update of information on the portal [chapter 15.35 patent gazette]|
Priority:
Application number | Filing date | Patent title
US201762611339P| true| 2017-12-28|2017-12-28|
US201762611341P| true| 2017-12-28|2017-12-28|
US201762611340P| true| 2017-12-28|2017-12-28|
US62/611,339|2017-12-28|
US62/611,340|2017-12-28|
US62/611,341|2017-12-28|
US201862649291P| true| 2018-03-28|2018-03-28|
US62/649,291|2018-03-28|
US15/940,722|US20190200905A1|2017-12-28|2018-03-29|Characterization of tissue irregularities through the use of mono-chromatic light refractivity|
US15/940,722|2018-03-29|
PCT/IB2018/055697|WO2019130073A1|2017-12-28|2018-07-30|Characterization of tissue irregularities through the use of monochromatic light refractivity|